Markov plays a game for three turns. On each turn, he either rolls a fair, six-sided die or flips a fair coin. If he rolls a 1 or 2 on the die, he will switch to the coin on the next turn, and if he flips tails on the coin, he will switch to the die on the next turn. If Markov starts by rolling the die, what is the probability that he will flip the coin on the third turn?

What I did
--
On the first turn, there is a 1/3 chance of switching to the coin and a 2/3 chance of staying with the die.
On the second turn, the probabilities for the die are the same.
The coin has a 1/2 chance of going back to the die and a 1/2 chance of staying with the coin.

So is it 1/6, because 1/2 * 1/3 = 1/6?
I don't think this is correct, so I need someone to confirm. If it is correct, please say so; if not, please post your solution.
2/9 is not correct.

Let

D be having the die at the end of a turn
C be having the coin at the end of a turn

Make a tree diagram.
We start with D.
We have either a D or a C at the end of the first toss: two branches.
From each of those, we again have a D or a C at the end of the 2nd toss (which is what he holds at the start of the third turn).
We are now looking at the ends of 4 branches:
D:DD ---> (2/3)(2/3) = 4/9
D:DC ---> (2/3)(1/3) = 2/9
D:CD ---> (1/3)(1/2) = 1/6
D:CC ---> (1/3)(1/2) = 1/6 , notice that 4/9 + 2/9 + 1/6 + 1/6 = 1

we need the cases that end in C
prob = 2/9 +1/6 = 7/18
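The tree above can be cross-checked by brute-force enumeration; here is a minimal sketch in Python using exact fractions (the D/C labels follow the tree):

```python
from fractions import Fraction
from itertools import product

# Transition probabilities: given the current implement, the chance of
# each implement on the NEXT turn.
# Die: roll 1 or 2 (prob 1/3) -> switch to coin; otherwise keep the die.
# Coin: tails (prob 1/2) -> switch to die; heads -> keep the coin.
step = {
    'D': {'D': Fraction(2, 3), 'C': Fraction(1, 3)},
    'C': {'D': Fraction(1, 2), 'C': Fraction(1, 2)},
}

# Turn 1 is the die; enumerate the implements on turns 2 and 3
# and sum the probability of every path that ends with the coin.
total = Fraction(0)
for t2, t3 in product('DC', repeat=2):
    if t3 == 'C':
        total += step['D'][t2] * step[t2][t3]

print(total)  # 7/18
```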

Damon, if we follow your interpretation, the outcomes would be

dddd
dddc --- (2/3)(2/3)(1/3)
ddcd
ddcc --- (2/3)(1/3)(1/2)
dcdd
dcdc -- (1/3)(1/2)(1/3)
dccd
dccc -- (1/3)(1/2)(1/2)
then there would be 4 outcomes that end with a coin, for a total of
4/27 + 1/9 + 1/18 + 1/12 = 43/108
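Under that four-toss reading, the same style of enumeration (a sketch in Python with exact fractions) confirms the 43/108 total:

```python
from fractions import Fraction
from itertools import product

# Same transition rules as before, but under the four-toss reading:
# the question asks for a coin on the FOURTH toss.
step = {
    'D': {'D': Fraction(2, 3), 'C': Fraction(1, 3)},
    'C': {'D': Fraction(1, 2), 'C': Fraction(1, 2)},
}

total = Fraction(0)
for path in product('DC', repeat=3):  # tosses 2, 3 and 4 (toss 1 is the die)
    if path[-1] == 'C':               # keep only paths ending with the coin
        p, prev = Fraction(1), 'D'
        for s in path:
            p *= step[prev][s]
            prev = s
        total += p

print(total)  # 43/108
```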

I think you found the outcomes at the 4th toss.

To solve this problem, we can use the concept of a Markov chain: a mathematical model of a sequence of events in which the probability of the next event depends only on the current one.

In this case, there are two possible states on each turn: rolling the die (D) or flipping the coin (C). Switching is not a separate state; it is already captured by the transition probabilities.

We can represent the transitions between states using a transition probability matrix. In this matrix, the (i, j) entry is the probability of moving from state i on one turn to state j on the next.

The transition probability matrix for this game is:

D C
D 2/3 1/3
C 1/2 1/2

(From the die, rolling a 1 or 2 has probability 2/6 = 1/3 and sends him to the coin; from the coin, tails has probability 1/2 and sends him back to the die.)

To find the probability of flipping the coin on the third turn, we compute the probability of being in state C on the third turn by propagating the starting distribution through two transitions.

Starting with the first turn:
- The probability of being in state D is 1, since Markov starts by rolling the die.

On the second turn:
- P(D_2) = 2/3 (from D to D).
- P(C_2) = 1/3 (from D to C).

On the third turn, he can reach the coin from either state:

P(C_3) = P(D_2) * (1/3) + P(C_2) * (1/2)
= (2/3)(1/3) + (1/3)(1/2)
= 2/9 + 1/6
= 7/18

Therefore, the probability that Markov will flip the coin on the third turn is 7/18.
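As a numerical cross-check, the chain can be iterated directly; a minimal sketch in Python with exact fractions (only the two states D and C are needed, since switching is encoded in the transition probabilities):

```python
from fractions import Fraction

# Transition matrix: row = current state, column = next state.
P = {
    'D': {'D': Fraction(2, 3), 'C': Fraction(1, 3)},
    'C': {'D': Fraction(1, 2), 'C': Fraction(1, 2)},
}

dist = {'D': Fraction(1), 'C': Fraction(0)}  # turn 1: certainly the die
for _ in range(2):                           # advance to turns 2 and 3
    dist = {s: sum(dist[r] * P[r][s] for r in 'DC') for s in 'DC'}

print(dist['C'])  # 7/18
```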

I get

die - coin - coin - coin = 1/3 * 1/2 * 1/2 = 1/12
die - die - die - coin = 2/3 * 2/3 * 1/3 = 4/27
so
1/12 + 4/27 = 9/108 + 16/108 = 25/108

oh

+
die - coin - die - coin = 1/3 * 1/2 * 1/3 = 1/18
die - die - coin - coin = 2/3 * 1/3 * 1/2 = 1/9
25/108 + 6/108 + 12/108 = 43/108