In a medical prognosis model, there are three states for a person: well, ill, and dead. We are given that the probability one stays well, given that the person is well, is 0.3. The probability one becomes ill, given that one is well, is 0.5. The probability one stays ill, given that one is ill, is 0.5, and the probability one becomes well, given that the person is ill, is 0.5.

Find the transition matrix P for the Markov chain.

I think the matrix looks like this:

    W    I    D
W [ 0.3  0.5  0   ]
I [ 0.5  0.5  0   ]
D [ 0.2  0    1   ]

This says, though, that it is impossible for an ill person to die. Are you sure the question is correct?

To find the transition matrix P for the Markov chain, we need to define the probabilities of transitioning between the different states. In this case, we have three states: well, ill, and dead.

Let's represent the states as follows:
State 1: Well
State 2: Ill
State 3: Dead

Now, we can define the transition probabilities based on the given information, using the fact that the probabilities out of each state must sum to 1:

P11: Probability of transitioning from well to well = 0.3
P12: Probability of transitioning from well to ill = 0.5
P13: Probability of transitioning from well to dead = 1 - (P11 + P12) = 1 - (0.3 + 0.5) = 0.2

P21: Probability of transitioning from ill to well = 0.5
P22: Probability of transitioning from ill to ill = 0.5
P23: Probability of transitioning from ill to dead = 1 - (P21 + P22) = 1 - (0.5 + 0.5) = 0, consistent with the observation above that an ill person cannot die directly.

P31: Probability of transitioning from dead to well = 0
P32: Probability of transitioning from dead to ill = 0
P33: Probability of transitioning from dead to dead = 1
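
The same bookkeeping can be done in a few lines of code. Here is a minimal Python sketch (the variable names are my own, not part of the question) that derives the well-to-dead and ill-to-dead entries from the requirement that each state's outgoing probabilities sum to 1:

# Given transition probabilities (variable names are illustrative)
p_ww, p_wi = 0.3, 0.5   # from Well: stay well, become ill
p_iw, p_ii = 0.5, 0.5   # from Ill:  become well, stay ill

# The outgoing probabilities of each state must sum to 1,
# so the "to Dead" entries are whatever is left over.
p_wd = 1 - (p_ww + p_wi)   # 0.2 (up to floating-point rounding)
p_id = 1 - (p_iw + p_ii)   # 0.0, an ill person cannot die directly

print(p_wd, p_id)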

Now, we can arrange these transition probabilities in matrix form, where the entry Pij in row i and column j is the probability of moving from state i to state j:

P = | P11  P12  P13 |
    | P21  P22  P23 |
    | P31  P32  P33 |

Substituting the values, we get:

P = | 0.3  0.5  0.2 |
    | 0.5  0.5  0   |
    | 0    0    1   |

This is the transition matrix P for the Markov chain, written in the row convention, where each row sums to 1. The matrix in the question above is its transpose, with columns rather than rows indicating the starting state; both conventions encode the same probabilities, including the zero probability of moving directly from ill to dead.
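
To see what the model predicts over time, you can push an initial distribution through powers of P. In the row convention, a distribution is a row vector and one step of the chain is v @ P. Below is a small sketch assuming NumPy is available; the state ordering Well, Ill, Dead is the one chosen above:

import numpy as np

# Transition matrix in the row convention (rows are the from-state)
P = np.array([
    [0.3, 0.5, 0.2],   # from Well: stay well, become ill, die
    [0.5, 0.5, 0.0],   # from Ill:  recover, stay ill, cannot die directly
    [0.0, 0.0, 1.0],   # from Dead: absorbing state
])

# Sanity check: each row must sum to 1 for P to be stochastic.
assert np.allclose(P.sum(axis=1), 1.0)

# The distribution after n steps is v P^n.
v = np.array([1.0, 0.0, 0.0])   # start in the Well state
for n in (1, 5, 20, 100):
    print(n, (v @ np.linalg.matrix_power(P, n)).round(4))

Since dead is the only absorbing state and it is reachable from well, the distribution converges to (0, 0, 1): in the long run everyone ends up in the dead state, even though an ill person can only die after first recovering.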