The probability of Heads of a coin is y, and this bias y is itself the realization of a random variable Y which is uniformly distributed on the interval [0,1].

To estimate the bias of this coin, we flip it 6 times and define the (observed) random variable N as the number of Heads in this experiment.

Throughout this problem, you may find the following formula useful:
For all positive integers n and k,
∫0^1 x^n (1-x)^k dx = (n! * k!) / (n+k+1)!.
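As a quick sanity check of this identity (my own aside, in Python with scipy, which the problem itself does not require):

```python
# Numerically verify: integral_0^1 x^n (1-x)^k dx = n! k! / (n+k+1)!
from math import factorial
from scipy.integrate import quad

n, k = 3, 3
numeric, _ = quad(lambda x: x**n * (1 - x)**k, 0, 1)
exact = factorial(n) * factorial(k) / factorial(n + k + 1)
print(numeric, exact)  # both equal 1/140 ≈ 0.0071428
```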


Given the observation N=3, calculate the posterior distribution of the bias Y. That is, find the conditional distribution of Y, given N=3.

For 0≤y≤1,

fY|N(y∣N=3)=

What is the LMS estimate of Y, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)

Y^LMS =

What is the resulting conditional mean squared error of the LMS estimator, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)


Why did the coin go to therapy? Because it had a lot of Heads and Tails!

Now, let's tackle the problem at hand. We want to find the conditional distribution of the bias Y, given N=3. To do that, we can use Bayes' theorem. But first, let's find the prior distribution of Y.

Since Y is uniformly distributed on the interval [0,1], we have fY(y) = 1 for 0≤y≤1, and 0 otherwise.

Next, let's find the likelihood function fN|Y(n|y), which gives the probability of observing n Heads given a specific value of the bias Y=y.

If we flip the coin 6 times, the probability of getting n Heads is given by the binomial distribution:

fN|Y(n|y) = (6 choose n) * y^n * (1-y)^(6-n), for n=0,1,2,3,4,5,6.
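As an aside (not part of the assignment), this pmf can be cross-checked against scipy's binomial distribution at an arbitrary test bias:

```python
# Compare the hand-written binomial pmf with scipy.stats.binom at y = 0.4.
from math import comb
from scipy.stats import binom

y = 0.4  # an arbitrary test value of the bias
manual = comb(6, 3) * y**3 * (1 - y)**3
print(manual, binom.pmf(3, 6, y))  # both ≈ 0.27648
```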

Now, we can use Bayes' theorem to find the posterior distribution of Y:

fY|N(y|N=3) = (fN|Y(3|y) * fY(y)) / ∫0^1 fN|Y(3|t) * fY(t) dt.

Let's calculate the integral:

∫0^1 fN|Y(3|y) * fY(y) dy = ∫0^1 (6 choose 3) * y^3 * (1-y)^(6-3) * 1 dy.

Using the formula given in the problem with n = k = 3, we find:

∫0^1 fN|Y(3|y) * fY(y) dy = (6 choose 3) * (3! * 3!) / (3+3+1)! = 20 * (36/5040) = 20/140 = 1/7.
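If you want to double-check this marginal probability symbolically, a short sympy computation (my own verification sketch) gives the same 1/7:

```python
# Symbolic check that P(N = 3) = 1/7 under the uniform prior.
import sympy as sp

y = sp.symbols('y')
pN3 = sp.integrate(sp.binomial(6, 3) * y**3 * (1 - y)**3, (y, 0, 1))
print(pN3)  # 1/7
```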

Now, let's find fY|N(y|N=3):

fY|N(y|N=3) = (fN|Y(3|y) * fY(y)) / ∫0^1 fN|Y(3|t) * fY(t) dt.

Substituting the values we found earlier, we have:

fY|N(y|N=3) = [(6 choose 3) * y^3 * (1-y)^3 * 1] / (1/7).

Simplifying, we get:

fY|N(y|N=3) = 140 * y^3 * (1-y)^3 for 0≤y≤1.

Therefore, the conditional distribution of Y, given N=3, is fY|N(y|N=3) = 140 * y^3 * (1-y)^3 for 0≤y≤1, which is a Beta(4,4) density.
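Since 140 = 1/B(4,4), we can confirm both the normalization and the Beta(4,4) match numerically (an illustration, assuming scipy is available):

```python
# Check that 140 y^3 (1-y)^3 integrates to 1 and matches the Beta(4,4) pdf.
from scipy.integrate import quad
from scipy.stats import beta

post = lambda y: 140 * y**3 * (1 - y)**3
print(quad(post, 0, 1)[0])             # ≈ 1.0, so it is a valid density
print(post(0.3), beta.pdf(0.3, 4, 4))  # the two densities agree
```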

Now, let's move on to the LMS estimate of Y, given N=3. The LMS estimate is the mean of the conditional distribution, which is the integral of y*fY|N(y|N=3) over the interval [0,1].

Calculating the integral:

∫0^1 y * (140 * y^3 * (1-y)^3) dy = 140 * ∫0^1 y^4 * (1-y)^3 dy.

Using the same formula with n = 4 and k = 3, we find:

∫0^1 y * (140 * y^3 * (1-y)^3) dy = 140 * (4! * 3!) / (4+3+1)! = 140 * (144/40320) = 140/280 = 1/2.

Therefore, the LMS estimate of Y, given N=3, is Y^LMS = 1/2.
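A one-line numerical confirmation of this conditional mean (my own check):

```python
# E[Y | N = 3] as a numerical integral of y times the posterior density.
from scipy.integrate import quad

mean, _ = quad(lambda y: y * 140 * y**3 * (1 - y)**3, 0, 1)
print(mean)  # 0.5
```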

Finally, let's calculate the resulting conditional mean squared error (CMSE) of the LMS estimator.

The CMSE is the conditional expectation of the squared difference between the true bias Y and the LMS estimate Y^LMS. It can be calculated as the integral of (y - Y^LMS)^2 * fY|N(y|N=3) over the interval [0,1].

Calculating the integral:

∫0^1 (y - 1/2)^2 * (140 * y^3 * (1-y)^3) dy.

Expanding the square and using the same formula as before, this reduces to E[Y^2 | N=3] - (1/2)^2, where

E[Y^2 | N=3] = 140 * ∫0^1 y^5 * (1-y)^3 dy = 140 * (5! * 3!) / (5+3+1)! = 140 * (720/362880) = 5/18,

so the error is 5/18 - 1/4 = 1/36.

Therefore, the resulting CMSE of the LMS estimator, given N=3, is 1/36 ≈ 0.0278, which is exactly the variance of the Beta(4,4) posterior.
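And the corresponding numerical check of the conditional mean squared error (again my own aside):

```python
# Posterior variance = conditional MSE of the LMS estimator, given N = 3.
from scipy.integrate import quad

mse, _ = quad(lambda y: (y - 0.5)**2 * 140 * y**3 * (1 - y)**3, 0, 1)
print(mse, 1 / 36)  # both ≈ 0.02778
```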

I hope this helps! Let me know if you have any other questions, or if you want to hear a coin joke!

To calculate the posterior distribution of the bias Y, given N=3, we need to use Bayes' theorem. Bayes' theorem states that:

fY|N(y|N=3) = fN|Y(N=3|y) * fY(y) / fN(N=3)

We know that, given Y=y, N follows a binomial distribution with parameters n=6 and p=y. Therefore, fN|Y(N=3|y) = (6 choose 3) * y^3 * (1-y)^3 = 20 * y^3 * (1-y)^3.

Since Y follows a uniform distribution on the interval [0,1], fY(y) = 1 for 0≤y≤1.

To find fN(N=3), we need to calculate the integral of fN|Y(N=3|y) * fY(y) over the interval [0,1].

Using the formula given, we have:

fN(N=3) = ∫[0,1] (6 choose 3) * y^3 * (1-y)^3 * 1 dy
= 20 * ∫[0,1] y^3 * (1-y)^3 dy

To calculate the remaining integral, we can expand the integrand using binomial coefficients:

∫[0,1] y^3 * (1-y)^3 dy
= ∫[0,1] (y^3 - 3y^4 + 3y^5 - y^6) dy
= 1/4 - 3/5 + 3/6 - 1/7
= 1/140

so fN(N=3) = 20 * (1/140) = 1/7.
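The expansion and the resulting value can be verified with sympy (a verification sketch, not part of the derivation):

```python
# Expand y^3 (1-y)^3 and integrate it over [0, 1].
import sympy as sp

y = sp.symbols('y')
print(sp.expand(y**3 * (1 - y)**3))                # -y**6 + 3*y**5 - 3*y**4 + y**3
print(sp.integrate(y**3 * (1 - y)**3, (y, 0, 1)))  # 1/140
```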

Now we can substitute all these values back into Bayes' theorem:

fY|N(y|N=3) = fN|Y(N=3|y) * fY(y) / fN(N=3)
= [20 * y^3 * (1-y)^3 * 1] / [1/7]
= 140 * y^3 * (1-y)^3

Therefore, the conditional distribution of Y, given N=3, is:

fY|N(y|N=3) = 140 * y^3 * (1-y)^3, for 0≤y≤1.

To find the LMS estimate of Y, given N=3, we take the conditional mean (not the mode) of the posterior distribution:

Y^LMS = E[Y|N=3] = ∫[0,1] y * 140 * y^3 * (1-y)^3 dy = 140 * (4! * 3!) / (4+3+1)! = 1/2.

(For this symmetric Beta(4,4) posterior the mode happens to coincide with the mean, but the LMS estimator is by definition the conditional expectation.)

To calculate the posterior distribution of the bias Y given the observation N=3, we need to use Bayes' theorem. Bayes' theorem allows us to update our prior belief about Y (which is a uniform distribution on the interval [0,1]) based on the observed data N=3.

Let's denote the bias Y as y, the observation N as n, and the prior distribution of Y as fY(y). According to Bayes' theorem, the posterior distribution of Y given N=3 can be calculated as:

fY|N(y|N=3) = (fN|Y(n|y) * fY(y))/ fN(n)

where:
fN|Y(n|y) is the likelihood of observing N=n given Y=y,
fY(y) is the prior distribution of Y, and
fN(n) is the marginal likelihood of observing N=n.

In this case, the likelihood function fN|Y(n|y) represents the probability of observing exactly n=3 Heads in 6 coin flips given the bias y.

The likelihood function can be calculated using the binomial distribution formula:

fN|Y(3|y) = (6 choose 3) * (y^3) * ((1-y)^(6-3))

Simplifying further, we have:

fN|Y(3|y) = 20 * (y^3) * ((1-y)^3)

The marginal likelihood fN(n) represents the probability of observing exactly n=3 Heads in 6 coin flips, averaged over all possible values of the bias y. The marginal likelihood can be calculated by integrating the likelihood function over the entire range of possible values of y (0 to 1):

fN(3) = ∫[0,1] fN|Y(3|y) * fY(y) dy

To find the LMS estimate of Y given N=3, we can calculate the conditional mean of Y given N=3. The conditional mean can be calculated as:

Y^LMS = ∫[0,1] y * fY|N(y|N=3) dy

Finally, to find the resulting conditional mean squared error of the LMS estimator given N=3, we can calculate the conditional mean squared error using the formula:

Conditional Mean Squared Error = ∫[0,1] (y - Y^LMS)^2 * fY|N(y|N=3) dy

You can use the formulas and concepts above to perform the calculations and find the answers to the questions.
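If you would rather sanity-check all three answers at once, a small Monte Carlo simulation of the whole experiment (my own sketch, using numpy) approximates the posterior mean and the conditional MSE directly:

```python
# Simulate the full experiment: draw a uniform bias, flip 6 coins,
# keep only the runs with exactly 3 Heads, and inspect the kept biases.
import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(size=1_000_000)   # prior draws of the bias Y
heads = rng.binomial(6, y)        # number of Heads in 6 flips at each bias
kept = y[heads == 3]              # condition on the event N = 3
print(kept.mean())                # ≈ 0.5  (the LMS estimate)
print(kept.var())                 # ≈ 1/36 ≈ 0.0278  (the conditional MSE)
```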

This is my solution:

1) fY|N(y|N=3) = 84 y^6 (1-y)^3
2) 0.0636363
3) 0.0383746

Is it correct?