A fair coin is tossed repeatedly and independently. We want to determine the expected number of tosses until we first observe Tails immediately preceded by Heads. To do so, we define a Markov chain with four states, {S,H,T,HT}, where S is a starting state, H indicates Heads on the current toss, T indicates Tails on the current toss (without Heads on the previous toss), and HT indicates Heads followed by Tails over the last two tosses. This Markov chain is illustrated below:

1. What is the expected number of tosses until we first observe Tails immediately preceded by Heads? Hint: Solve the corresponding mean first passage time problem for our Markov chain.

2. Assuming that we have just observed Tails immediately preceded by Heads, what is the expected number of additional tosses until we next observe Tails immediately preceded by Heads?

Next, we want to answer similar questions for the event that Tails is immediately preceded by Tails. Set up a new Markov chain from which you can calculate the expected number of tosses until we first observe Tails immediately preceded by Tails.

3. What is the expected number of tosses until we first observe Tails immediately preceded by Tails?

4. Assuming that we have just observed Tails immediately preceded by Tails, what is the expected number of additional tosses until we again observe Tails immediately preceded by Tails?


1) 4
2) 4
3) 6
4) 4
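Before working through the algebra, the four answers can be sanity-checked by simulation. The sketch below (function names are ours) estimates the first passage times by repeated trials, and the recurrence times by measuring the average gap between successive, possibly overlapping, pattern occurrences in one long sequence:

```python
import random

random.seed(0)

def first_time(pattern, trials=50_000):
    """Average number of tosses until the 2-character `pattern`
    (e.g. "HT") first appears in a fair coin sequence."""
    total = 0
    for _ in range(trials):
        prev, n = "", 0
        while True:
            toss = random.choice("HT")
            n += 1
            if prev + toss == pattern:
                break
            prev = toss
        total += n
    return total / trials

def mean_gap(pattern, n=500_000):
    """Average gap between successive (possibly overlapping)
    occurrences of `pattern` in one long fair coin sequence."""
    seq = "".join(random.choice("HT") for _ in range(n))
    ends = [i for i in range(1, n) if seq[i - 1 : i + 1] == pattern]
    return (ends[-1] - ends[0]) / (len(ends) - 1)

print(first_time("HT"))  # ~ 4  (question 1)
print(mean_gap("HT"))    # ~ 4  (question 2)
print(first_time("TT"))  # ~ 6  (question 3)
print(mean_gap("TT"))    # ~ 4  (question 4)
```

Note the asymmetry the simulation confirms: TT takes longer to appear from scratch (6 tosses) than HT (4 tosses), even though both patterns recur every 4 tosses on average.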

To calculate the expected number of tosses until we first observe Tails immediately preceded by Heads, we need to solve the corresponding mean first passage time problem for the given Markov chain. Let's go through the steps.

Step 1: Define the transition probabilities of the Markov chain.
For the Markov chain with states {S, H, T, HT}, the transition probabilities can be represented in a transition probability matrix P as follows:

      S     H     T     HT
S     0    0.5   0.5    0
H     0    0.5    0    0.5
T     0    0.5   0.5    0
HT    0     0     0     1

Each row sums to 1: from S, H, or T, a Heads moves us to H; a Tails moves us to T unless the previous toss was Heads, in which case it moves us to HT. For the first passage calculation, HT is treated as absorbing.
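The mean first passage times can also be computed directly from the transient part of this matrix, since the vector of expected tosses to absorption t satisfies (I - Q)t = 1. A sketch using numpy:

```python
import numpy as np

# Transient part Q of the transition matrix (rows/columns S, H, T);
# the absorbing state HT is dropped.  Expected tosses to absorption
# solve the linear system (I - Q) t = 1.
Q = np.array([
    [0.0, 0.5, 0.5],   # S: Heads -> H, Tails -> T
    [0.0, 0.5, 0.0],   # H: Heads -> H (Tails -> HT, absorbed)
    [0.0, 0.5, 0.5],   # T: Heads -> H, Tails -> T
])
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print({s: round(float(x), 6) for s, x in zip("SHT", t)})
# {'S': 4.0, 'H': 2.0, 'T': 4.0}
```

This reproduces the answer to question 1 (M(S) = 4) and, as a bonus, the value M(T) = 4 that answers question 2.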

Step 2: Define the absorbing state.
In this case, the absorbing state is HT, which means once we reach state HT, we stop tossing the coin.

Step 3: Set up the system of equations for the mean first passage times.
Let M(i) represent the mean first passage time from state i to the absorbing state HT. The mean first passage time satisfies the following system of equations:

M(S) = 1 + 0.5M(H) + 0.5M(T)
M(H) = 1 + 0.5M(H) + 0.5M(HT)
M(T) = 1 + 0.5M(H) + 0.5M(T)
M(HT) = 0

Each equation accounts for one toss plus the mean first passage time from the next state. For example, from H a Heads returns us to H and a Tails takes us to HT.

Step 4: Solve the system of equations.
We are interested in the mean first passage time from the starting state S, so we need to solve for M(S).

From the second equation, substituting M(HT) = 0:

M(H) = 1 + 0.5M(H)

0.5M(H) = 1, so M(H) = 2.

Substituting M(H) = 2 into the third equation:

M(T) = 1 + 0.5(2) + 0.5M(T) = 2 + 0.5M(T)

0.5M(T) = 2, so M(T) = 4.

Finally, substituting M(H) = 2 and M(T) = 4 into the first equation:

M(S) = 1 + 0.5(2) + 0.5(4) = 4

Therefore, the expected number of tosses until we first observe Tails immediately preceded by Heads is 4, matching answer 1).

For question 2, note that after observing Heads followed by Tails, the most recent toss is Tails with no Heads pending, so we are effectively in state T. The expected number of additional tosses is therefore M(T) = 4.
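The back-substitution can be replayed exactly with Python's `fractions` module, which avoids any floating-point doubt about the arithmetic (a minimal sketch):

```python
from fractions import Fraction as F

half = F(1, 2)
# M(H) = 1 + 0.5*M(H)             =>  0.5*M(H) = 1
M_H = F(1) / half
# M(T) = 1 + 0.5*M(H) + 0.5*M(T)  =>  0.5*M(T) = 1 + 0.5*M(H)
M_T = (1 + half * M_H) / half
# M(S) = 1 + 0.5*M(H) + 0.5*M(T)
M_S = 1 + half * M_H + half * M_T
print(M_S, M_H, M_T)  # 4 2 4
```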

Now, using a similar approach, you can find the expected number of tosses until we first observe Tails immediately preceded by Tails, and the expected number of additional tosses until we again observe it. Define a new Markov chain with states {S, H, T, TT}, treat TT as absorbing for the first passage calculation, and modify the system of equations accordingly. As in question 2, note which state the chain occupies immediately after a Tails-Tails is observed.
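As a check on questions 3 and 4, the same linear-system computation can be run for the TT chain (a sketch; here we assume that from S or H a Heads goes to H and a Tails goes to T, while from T a Tails is absorbed into TT):

```python
import numpy as np

# Transient part Q for the TT chain (rows/columns S, H, T; TT absorbing).
Q = np.array([
    [0.0, 0.5, 0.5],   # S: Heads -> H, Tails -> T
    [0.0, 0.5, 0.5],   # H: Heads -> H, Tails -> T
    [0.0, 0.5, 0.0],   # T: Heads -> H (Tails -> TT, absorbed)
])
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print({s: round(float(x), 6) for s, x in zip("SHT", t)})
# {'S': 6.0, 'H': 6.0, 'T': 4.0}
```

M(S) = 6 answers question 3. Since a fresh Tails-Tails leaves the chain in state T, M(T) = 4 answers question 4.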