To estimate the bias of this coin, we flip it 6 times and define the (observed) random variable N as the number of Heads in this experiment. The bias Y of the coin is assumed to be uniformly distributed on [0, 1].

Throughout this problem, you may find the following formula useful: for all positive integers n and k,

∫_0^1 x^n (1-x)^k dx = n! * k! / (n+k+1)!

Given the observation N=3 , calculate the posterior distribution of the bias Y . That is, find the conditional distribution of Y , given N=3 .

For 0≤y≤1 ,

fY|N(y∣N=3)=?

What is the LMS estimate of Y , given N=3 ?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)

YLMS=?

What is the resulting conditional mean squared error of the LMS estimator, given N=3 ?
(Enter an exact expression or a decimal accurate to at least 2 decimal places.)

MSE(YLMS|N=3)=?

Answer:
fY|N(y∣N=3) = 140y^3(1-y)^3, for 0 ≤ y ≤ 1
YLMS = 1/2 = 0.5
MSE(YLMS|N=3) = 1/36 ≈ 0.028
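These values can be double-checked with a short Python sketch, assuming the uniform prior used in the derivations below and using only the integral formula from the problem statement:

```python
from math import factorial

def int_xn_1mxk(n, k):
    # Formula from the problem: integral of x^n (1-x)^k over [0, 1]
    return factorial(n) * factorial(k) / factorial(n + k + 1)

norm = 140 * int_xn_1mxk(3, 3)               # posterior integrates to 1
y_lms = 140 * int_xn_1mxk(4, 3)              # E[Y | N=3] = 1/2
mse = 140 * int_xn_1mxk(5, 3) - y_lms ** 2   # Var(Y | N=3) = 1/36
print(norm, y_lms, mse)  # ≈ 1.0, 0.5, 0.02778
```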

To calculate the posterior distribution of the bias Y given N=3, we need to use Bayes' theorem and assume a prior distribution for Y. Let's assume a Beta(α, β) prior distribution for Y.

The posterior distribution of Y given N=3 is given by:

fY|N(y|N=3) = (fN|Y(3|y)) * (fY(y)) / (fN(3))

To find the conditional distribution of N given Y, we use the binomial distribution:

fN|Y(n|y) = (6 choose n) * (y^n) * ((1-y)^(6-n))

To find the conditional distribution of Y, we use the prior distribution:

fY(y) = (1/B(α, β)) * (y^(α-1)) * ((1-y)^(β-1))

And the marginal distribution of N is:

fN(n) = (6 choose n) * (∫_0^1 (y^n) * ((1-y)^(6-n)) * (1/B(α, β)) * (y^(α-1)) * ((1-y)^(β-1)) * dy)

The LMS estimate of Y is the mean of the posterior distribution:

YLMS = ∫_0^1 (y * fY|N(y|N=3)) * dy

To find the conditional mean squared error of the LMS estimator, we use the conditional variance:

MSE = ∫_0^1 ((y - YLMS)^2 * fY|N(y|N=3)) * dy

To obtain exact expressions, we need to fix the values of α and β for the Beta prior; the uniform prior corresponds to the special case α = β = 1.
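As a concrete sketch of this Beta–binomial conjugacy (the helper names here are mine, not from the problem): with a Beta(α, β) prior and n heads in 6 flips, the posterior is Beta(α + n, β + 6 − n), and the uniform prior is Beta(1, 1):

```python
from math import factorial

def posterior_params(alpha, beta, heads, flips=6):
    # Beta(alpha, beta) prior + binomial likelihood -> Beta posterior
    return alpha + heads, beta + flips - heads

def inv_beta_fn(a, b):
    # 1/B(a, b) for integer a, b: (a+b-1)! / ((a-1)! * (b-1)!)
    return factorial(a + b - 1) / (factorial(a - 1) * factorial(b - 1))

a, b = posterior_params(1, 1, 3)  # uniform prior, N=3 -> Beta(4, 4)
print(a, b, inv_beta_fn(a, b))   # 4 4 140.0
```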

To calculate the posterior distribution of the bias Y, given that N=3, we can use Bayes' theorem. The posterior distribution is given by:

fY|N(y|N=3) = fN|Y(N=3|y) * fY(y) / fN(N=3)

Where:
- fY|N(y|N=3) is the posterior distribution of Y given N=3.
- fN|Y(N=3|y) is the likelihood function, representing the probability of observing N=3 given a specific bias y.
- fY(y) is the prior distribution, representing our initial belief about the bias y.
- fN(N=3) is the marginal distribution, representing the probability of observing N=3 regardless of the bias.

The likelihood function fN|Y(N=3|y) can be calculated using the binomial distribution formula:

fN|Y(N=3|y) = C(6, 3) * y^3 * (1-y)^(6-3)

Where C(6, 3) represents the binomial coefficient:

C(6, 3) = 6! / (3! * (6-3)!)

Let's calculate this:

C(6, 3) = 6! / (3! * 3!)
= 720 / (6 * 6)
= 20

Now we can substitute this value into the likelihood function:

fN|Y(N=3|y) = 20 * y^3 * (1-y)^(6-3)
= 20y^3 * (1-y)^3

Now we need to find the prior distribution fY(y). Assuming no prior knowledge about the bias, we can use a uniform distribution, which means fY(y) = 1 for 0 ≤ y ≤ 1, and fY(y) = 0 otherwise.

Finally, we need to calculate the marginal distribution fN(N=3). This represents the probability of observing N=3, regardless of the bias. We can calculate this by integrating the joint distribution fN,Y(n,y) over all possible values of y, i.e., integrating from 0 to 1:

fN(N=3) = ∫_0^1 fN,Y(n=3, y) dy

To calculate this integral, we can use the provided formula:

∫_0^1 x^n (1-x)^k dx = n! * k! / (n+k+1)!

In this case, n=3 and k=3:

∫_0^1 x^3 (1-x)^3 dx = 3! * 3! / (3+3+1)!
= 36 / 5040
= 1/140
≈ 0.00714286

Multiplying by the binomial coefficient gives fN(N=3) = 20 * (1/140) = 1/7.
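This marginal can be verified with a couple of lines of Python (standard library only):

```python
from math import comb, factorial

integral = factorial(3) * factorial(3) / factorial(3 + 3 + 1)  # 1/140
marginal = comb(6, 3) * integral                               # fN(3) = 20/140 = 1/7
print(integral, marginal)  # ≈ 0.0071429 0.1428571
```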

Now we can substitute all these values into the posterior distribution formula:

fY|N(y|N=3) = fN|Y(N=3|y) * fY(y) / fN(N=3)
= (20y^3 * (1-y)^3) * 1 / (1/7)
= 140 * y^3 * (1-y)^3

Next, we can calculate the LMS estimate of Y, given N=3. This estimate is given by the conditional mean of the posterior distribution:

YLMS = ∫_0^1 y * fY|N(y|N=3) dy

Plugging in the posterior distribution we derived earlier:

YLMS = ∫_0^1 y * (140 * y^3 * (1-y)^3) dy = 140 * ∫_0^1 y^4 * (1-y)^3 dy

Applying the provided formula with n=4 and k=3:

YLMS = 140 * (4! * 3! / (4+3+1)!) = 140 * (144 / 40320) = 140/280 = 0.5
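If one prefers a numerical check, a simple midpoint-rule approximation of the same integral, using the normalized posterior 140 y^3 (1-y)^3 (a sanity check, not required for the exact answer):

```python
# Midpoint-rule approximation of E[Y | N=3] = integral of y * 140 y^3 (1-y)^3 over [0, 1]
steps = 100_000
h = 1.0 / steps
y_lms = sum((i + 0.5) * h * 140 * ((i + 0.5) * h) ** 3 * (1 - (i + 0.5) * h) ** 3
            for i in range(steps)) * h
print(y_lms)  # ≈ 0.5
```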

Lastly, the conditional mean squared error of the LMS estimator, given N=3, can be calculated as the expected value of (Y - YLMS)^2:

E[(Y - YLMS)^2] = ∫_0^1 (y - YLMS)^2 * fY|N(y|N=3) dy

Expanding the square and applying the provided formula with n=5 and k=3:

E[(Y - YLMS)^2] = E[Y^2 | N=3] - (1/2)^2 = 140 * (5! * 3! / 9!) - 1/4 = 5/18 - 1/4 = 1/36 ≈ 0.0278
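The same midpoint rule, again with the normalized posterior 140 y^3 (1-y)^3, confirms the MSE numerically:

```python
# Midpoint-rule approximation of E[(Y - 0.5)^2 | N=3] with posterior 140 y^3 (1-y)^3
steps = 100_000
h = 1.0 / steps
mse = sum(((i + 0.5) * h - 0.5) ** 2 * 140 * ((i + 0.5) * h) ** 3 * (1 - (i + 0.5) * h) ** 3
          for i in range(steps)) * h
print(mse)  # ≈ 1/36 ≈ 0.02778
```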

Note that the posterior is a Beta(4, 4) distribution, so YLMS and the MSE are simply its mean (1/2) and variance (1/36).