A binary communication system is used to send one of two messages:

(i) message A is sent with probability 2/3, and consists of an infinite sequence of zeroes,

(ii) message B is sent with probability 1/3, and consists of an infinite sequence of ones.

The ith received bit is “correct” (i.e., the same as the transmitted bit) with probability 3/4, and is “incorrect” (i.e., a transmitted 0 is received as a 1, and vice versa) with probability 1/4. We assume that, conditioned on any specific message sent, the received bits, denoted by Y1, Y2, …, are independent.

Note: Enter numerical answers; do not enter '!' or combinations.

1) Find P(Y1=0), the probability that the first bit received is 0.
2) Given that message A was transmitted, what is the probability that exactly 6 of the first 10 received bits are ones? (Answer with at least 3 decimal digits.)
3) Find the probability that the first and second received bits are the same.
4) Given that Y1,…,Y5 were all equal to 0, what is the probability that Y6 is also zero?
5) Find the mean of K, where K=min{i:Yi=1} is the index of the first bit that is 1.

1) 7/12

3) .625
4) .74897
The other questions I don't know the answers to.

2) 210*((1/4)^6)*(3/4)^4

Please tell the answer to the fifth part.

5) 3.1111

2) 0.01622

Where did you get the 210 (in the second answer)?

And can you please write out the solution for point 5?


5) If K=1, then the probability is (1/3)*(3/4) + (2/3)*(1/4).

If K=2: (1/3)*(1/4)*(3/4) + (2/3)*(3/4)*(1/4)
If K=3: (1/3)*(1/4)^2*(3/4) + (2/3)*(3/4)^2*(1/4)
In general, P(K=n) = (1/3)*(1/4)^(n-1)*(3/4) + (2/3)*(3/4)^(n-1)*(1/4).
We are looking for the expected value of K, i.e. E[K] = sum from n=1 to infinity of n*P(K=n).
Doing the math (conditioned on the message, K is geometric with mean 4/3 under B and 4 under A), we conclude E[K] = (1/3)*(4/3) + (2/3)*4 = 28/9.
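The series can also be checked numerically by truncating the infinite sum at a large N (a quick sketch in Python; the cutoff N=200 is my own choice):

```python
# E[K] = sum over n >= 1 of n * P(K = n), with
# P(K = n) = (1/3)*(1/4)**(n-1)*(3/4) + (2/3)*(3/4)**(n-1)*(1/4).
# Terms beyond n = 200 are negligible, so truncate there.
N = 200
e_k = sum(
    n * ((1/3) * (1/4)**(n - 1) * (3/4) + (2/3) * (3/4)**(n - 1) * (1/4))
    for n in range(1, N + 1)
)
print(e_k)  # ≈ 3.1111, i.e. 28/9
```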

To solve these questions, we can use conditional probability (the total probability theorem and Bayes' rule) together with the given channel probabilities.

1) To find P(Y1=0), we need to consider both message A and message B probabilities. P(Y1=0) can be calculated as the probability of receiving a 0 if message A is transmitted, multiplied by the probability of transmitting A, plus the probability of receiving a 0 if message B is transmitted, multiplied by the probability of transmitting B.

P(Y1=0) = (2/3 * 3/4) + (1/3 * 1/4) = 6/12 + 1/12 = 7/12

Therefore, the probability that the first bit received is 0 is 7/12.
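This total-probability computation can be reproduced exactly with Python's fractions module (a minimal sketch; the variable names are my own):

```python
from fractions import Fraction as F

p_A, p_B = F(2, 3), F(1, 3)           # prior probabilities of the messages
p_correct, p_flip = F(3, 4), F(1, 4)  # channel: bit correct / bit flipped

# Total probability theorem: condition on which message was sent.
# Under A, a 0 is received when the bit is correct; under B, when it is flipped.
p_y1_zero = p_A * p_correct + p_B * p_flip
print(p_y1_zero)  # 7/12
```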

2) Given that message A (all zeros) was transmitted, each received bit is a 1 with probability 1/4, independently of the others. The number of ones among the first 10 received bits is therefore binomial with parameters n=10 and p=1/4, so the probability of exactly 6 ones is:

P(6 ones | A) = (10 C 6) * (1/4)^6 * (3/4)^4 = 210 * (1/4)^6 * (3/4)^4 ≈ 0.01622

Here 210 is the binomial coefficient (10 C 6) = 10!/(6!*4!), the number of ways to choose which 6 of the 10 positions hold the ones.

Therefore, the probability is approximately 0.01622.
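The binomial probability can be evaluated directly with the standard library (a quick check):

```python
from math import comb

# Given A (all zeros sent), each received bit is a 1 with probability 1/4.
# P(exactly 6 ones in 10 bits) = C(10, 6) * (1/4)^6 * (3/4)^4
p = comb(10, 6) * (1/4)**6 * (3/4)**4
print(comb(10, 6))  # 210
print(round(p, 5))  # 0.01622
```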

3) The received bits are independent only conditioned on the message, so we condition on which message was sent. Given either message, the first two received bits are the same exactly when both are correct or both are flipped:

P(Y1=Y2 | A) = (3/4)^2 + (1/4)^2 = 10/16 = 5/8

P(Y1=Y2 | B) = (1/4)^2 + (3/4)^2 = 5/8

By the total probability theorem:

P(Y1=Y2) = (2/3)*(5/8) + (1/3)*(5/8) = 5/8

Therefore, the probability is 5/8 = 0.625.
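The conditioning argument for the first two bits can be verified with exact arithmetic (a minimal sketch):

```python
from fractions import Fraction as F

p_A, p_B = F(2, 3), F(1, 3)
c, f = F(3, 4), F(1, 4)  # probability a received bit is correct / flipped

# Given the message, bits are independent; they agree iff both are
# correct or both are flipped. The expression is the same under A and B.
p_same_given_msg = c**2 + f**2
p_same = p_A * p_same_given_msg + p_B * p_same_given_msg
print(p_same)  # 5/8
```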

4) The received bits are independent only given the message, not unconditionally: observing five zeros in a row makes message A much more likely, and this updates the probability that Y6 is also zero. By Bayes' rule, the posterior probability of message A is:

P(A | Y1=...=Y5=0) = (2/3)*(3/4)^5 / [(2/3)*(3/4)^5 + (1/3)*(1/4)^5] = 486/487

Then, by the total probability theorem:

P(Y6=0 | Y1=...=Y5=0) = (486/487)*(3/4) + (1/487)*(1/4) = 1459/1948 ≈ 0.74897

Therefore, the probability is approximately 0.749.
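The Bayesian update after observing five zeros can be carried out exactly as well (a sketch; variable names are my own):

```python
from fractions import Fraction as F

p_A, p_B = F(2, 3), F(1, 3)
c, f = F(3, 4), F(1, 4)  # probability a received bit is correct / flipped

# Posterior probability of each message after observing five zeros.
post_A = p_A * c**5 / (p_A * c**5 + p_B * f**5)
post_B = 1 - post_A

# Y6 is 0 if the message is A and the bit is correct, or B and flipped.
p_y6_zero = post_A * c + post_B * f
print(p_y6_zero, float(p_y6_zero))  # 1459/1948 ≈ 0.74897
```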

5) To find the mean of K, where K=min{i:Yi=1} is the index of the first received 1, condition on the message. Given A, each received bit is a 1 with probability 1/4, so K is geometric with parameter 1/4 and E(K | A) = 4. Given B, each received bit is a 1 with probability 3/4, so K is geometric with parameter 3/4 and E(K | B) = 4/3.

By the total expectation theorem:

E(K) = (2/3)*E(K | A) + (1/3)*E(K | B) = (2/3)*4 + (1/3)*(4/3) = 8/3 + 4/9 = 28/9 ≈ 3.111

Therefore, the mean of K is 28/9.
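The value 28/9 ≈ 3.111 can also be sanity-checked with a quick Monte Carlo simulation (a sketch of my own, not part of the original derivation):

```python
import random

random.seed(0)
trials = 200_000
total = 0
for _ in range(trials):
    # Choose the message, then draw received bits until the first 1.
    # Under A (all zeros sent) a received 1 is an error (prob 1/4);
    # under B (all ones sent) a received 1 is a correct bit (prob 3/4).
    p_one = 1/4 if random.random() < 2/3 else 3/4
    k = 1
    while random.random() >= p_one:
        k += 1
    total += k
print(total / trials)  # close to 28/9 ≈ 3.111
```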

If you have any further questions, feel free to ask!