Maths Probability

Consider a Markov chain X0,X1,X2,… described by the transition probability graph shown below. The chain starts at state 1; that is, X0=1.

(The graph image is not reproduced here. Per the reposted version of this question further down the page, the transition probabilities are: 1→1: 0.75, 1→2: 0.25; 2→1: 0.375, 2→2: 0.25, 2→3: 0.375; 3→2: 0.25; and, since each row must sum to 1, presumably 3→3: 0.75.)

Find the probability that X2=3.

P(X2=3)= - unanswered
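A quick numerical check is possible, assuming the transition probabilities from the reposted version of this question (1→1: 3/4, 1→2: 1/4; 2→1: 3/8, 2→2: 1/4, 2→3: 3/8; 3→2: 1/4, 3→3: 3/4): P(X2=3) is then the (1,3) entry of the two-step matrix P².

```python
from fractions import Fraction as F

# Transition matrix assumed from the reposted version of this question
P = [[F(3, 4), F(1, 4), F(0)],
     [F(3, 8), F(1, 4), F(3, 8)],
     [F(0),    F(1, 4), F(3, 4)]]

def matmul(A, B):
    """Multiply two 3x3 matrices of Fractions."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P2 = matmul(P, P)
# Starting at state 1 (row index 0), probability of being in state 3 after 2 steps
print(P2[0][2])  # 3/32
```

Only the path 1→2→3 contributes, so the entry reduces to (1/4)(3/8).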

Find the probability that the process is in state 3 immediately after the second change of state. (A “change of state” is a transition that is not a self-transition.)

- unanswered

Find (approximately) P(X1000=2∣X1000=X1001).

P(X1000=2∣X1000=X1001)≈ - unanswered
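For large n the chain is approximately in steady state, so by Bayes' rule P(X1000=2 ∣ X1000=X1001) ≈ π2·p22 / Σi πi·pii. A sketch, again assuming the matrix from the repost (a birth-death chain, so the local balance equations give the stationary distribution directly):

```python
from fractions import Fraction as F

# Assumed transition matrix
P = [[F(3, 4), F(1, 4), F(0)],
     [F(3, 8), F(1, 4), F(3, 8)],
     [F(0),    F(1, 4), F(3, 4)]]

# Birth-death balance: pi1*p12 = pi2*p21 and pi3*p32 = pi2*p23
r1 = P[1][0] / P[0][1]   # pi1 / pi2
r3 = P[1][2] / P[2][1]   # pi3 / pi2
pi2 = 1 / (r1 + 1 + r3)
pi = [r1 * pi2, pi2, r3 * pi2]   # stationary distribution

# Bayes: P(X_n = 2 | X_n = X_{n+1}) ~= pi2*p22 / sum_i pi_i*p_ii
num = pi[1] * P[1][1]
den = sum(pi[i] * P[i][i] for i in range(3))
print(num / den)  # 1/10
```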
Let T be the first time that the state is equal to 3.

E[T]= - unanswered
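E[T] follows from first-step analysis on the assumed matrix: with e_i the expected time to reach state 3 starting from state i, e1 = 1 + (3/4)e1 + (1/4)e2 and e2 = 1 + (3/8)e1 + (1/4)e2, with e3 = 0. A sketch solving this 2×2 system exactly by Cramer's rule:

```python
from fractions import Fraction as F

# Assumed transition probabilities (from the reposted version of this question)
p11, p12 = F(3, 4), F(1, 4)
p21, p22 = F(3, 8), F(1, 4)

# Rearranged first-step equations:
#   (1 - p11)*e1 - p12*e2 = 1
#   -p21*e1 + (1 - p22)*e2 = 1
a, b, c, d = 1 - p11, -p12, -p21, 1 - p22
det = a * d - b * c
e1 = (1 * d - b * 1) / det   # expected time to reach 3 from state 1
e2 = (a * 1 - 1 * c) / det   # expected time to reach 3 from state 2
print(e1, e2)
```

Note that e1 and e2 differ; since the chain starts at state 1 here, E[T] is e1, not e2.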

Suppose for this part of the problem that the process starts instead at state 2, i.e., X0=2. Let S be the first time by which both states 1 and 3 have been visited.

E[S]= - unanswered
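By the strong Markov property, S splits into the time to first leave state 2 (hitting 1 or 3) plus the time to cross to the opposite endpoint. Under the assumed matrix, the chain is symmetric in states 1 and 3 (p21 = p23 and p12 = p32), so the crossing time is the same whichever endpoint is hit first, and it equals the expected time from state 1 to state 3 computed by first-step analysis. A sketch:

```python
from fractions import Fraction as F

# Assumed transition probabilities
p11, p12 = F(3, 4), F(1, 4)
p21, p22 = F(3, 8), F(1, 4)

# Step 1: expected time, starting at 2, to first hit {1, 3}:
#   h = 1 + p22*h  =>  h = 1/(1 - p22)
h = 1 / (1 - p22)

# Step 2: expected crossing time between the endpoints, via the
# first-step system e1 = 1 + p11*e1 + p12*e2, e2 = 1 + p21*e1 + p22*e2:
a, b, c, d = 1 - p11, -p12, -p21, 1 - p22
det = a * d - b * c
e1 = (d - b) / det   # expected time from one endpoint to the other

print(h + e1)  # E[S]
```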

  1. I need the solution to the above problem; if anybody knows it, please post the answers.

    posted by a
  2. you did not provide transition probability graph/matrix when you first posted the problem

    posted by noob
  3. I posted the matrix in my post. I also posted the answers I got wrong when I tried.

  4. I am not sure, and I cannot check it until I am, but for 1 I have 3/32, i.e. 1/4 * 3/8.
    For 2 I have 1/2, but I'm really not sure.
    3 could be 1/3.
    4 should be 20/3, which is the sum of 1/p over p = 1/4 and p = 3/8 (4 + 8/3).
    5 ???

    posted by nugget
  5. Can you please answer? Thanks.

  6. I only have one answer left.

    posted by nugget
  7. @nugget
    Only (2) is correct,
    which is 1/2.

  8. 1) 3/32 correct
    2) 0.5 correct
    3) 1/3 incorrect
    4) 20/3 incorrect
    5

    posted by xxx
  9. Thanks

    posted by nugget

Similar Questions

  1. Maths Probability

    Consider a Markov chain X0,X1,X2,… described by the transition probability graph shown below. The chain starts at state 1; that is, X0=1. 1→1 p=.75, 1→2 p=.25, 2→1 p=.375, 2→2 p=.25, 2→3 p=.375, 3→2 p=.25, 3

    asked by xyz on May 20, 2014