Help please!

The probability of Heads of a coin is y, and this bias y is itself the realization of a random variable Y which is uniformly distributed on the interval [0,1].

To estimate the bias of this coin, we flip it 6 times and define the (observed) random variable N as the number of Heads in this experiment.

Throughout this problem, you may find the following formula useful:
For all positive integers n, k,
∫_0^1 x^n (1-x)^k dx = n! k! / (n+k+1)!.
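As a quick sanity check of that integral identity, here is a small sketch (standard library only, with a simple midpoint-rule quadrature) comparing the numeric integral against the factorial formula for n = k = 3:

```python
from math import factorial

def beta_integral(n, k, steps=100_000):
    """Midpoint-rule approximation of the integral of x^n (1-x)^k over [0, 1]."""
    h = 1.0 / steps
    return sum(((i + 0.5) * h) ** n * (1 - (i + 0.5) * h) ** k
               for i in range(steps)) * h

n, k = 3, 3
numeric = beta_integral(n, k)
exact = factorial(n) * factorial(k) / factorial(n + k + 1)  # 3!*3!/7! = 1/140
print(numeric, exact)
```

Both values come out to 1/140 ≈ 0.00714, which is exactly the normalizing integral needed later in this problem.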

Given the observation N=3, calculate the posterior distribution of the bias Y. That is, find the conditional distribution of Y, given N=3.

For 0≤y≤1,

fY|N(y∣N=3)=

What is the LMS estimate of Y, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)

Y^LMS =

What is the resulting conditional mean squared error of the LMS estimator, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)


Just to make sure you're heading in the right direction:

you should know that f(y) = 1 for 0 ≤ y ≤ 1 (the density of the uniform distribution on the unit interval)
Bayes' rule: posterior = prior f(y) * model f(N|y) / Fn
The "trick" is to see Fn as simply a normalizing constant.

The model is binomial: for N = 3 Heads,
f(N=3|y) = (6 choose 3) * y^3 * (1-y)^3
Applying Bayes' rule, the posterior is proportional to
(6 choose 3) * y^3 * (1-y)^3 * 1,
but note that for the conditional PDF to be valid it must integrate to 1. The normalizing constant can be determined with the formula ∫_0^1 x^n (1-x)^k dx = n!k!/(n+k+1)! (here with n = k = 3); we can ignore the (6 choose 3), since it gets absorbed into this constant. The rest should be simple :) and your answer should be of the form c * y^3 * (1-y)^3.

For part b - how does the LMS relate to the expected value?

For part c - how does the LMS estimate's MSE relate to variance/bias?

1. 140*y^3*(1-y)^3

2. 1/2
3. 1/36
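The three answers above can be verified with exact arithmetic (a standard-library sketch; the posterior is a Beta(4, 4) density, and the LMS estimate and its conditional MSE are its mean and variance):

```python
from fractions import Fraction
from math import factorial

def beta_int(n, k):
    """Exact value of the integral of x^n (1-x)^k over [0, 1]: n! k! / (n+k+1)!."""
    return Fraction(factorial(n) * factorial(k), factorial(n + k + 1))

c = 1 / beta_int(3, 3)        # normalizing constant of y^3 (1-y)^3  -> 140
mean = c * beta_int(4, 3)     # E[Y | N=3] = integral of y * c y^3 (1-y)^3
second = c * beta_int(5, 3)   # E[Y^2 | N=3]
var = second - mean ** 2      # conditional variance = conditional MSE of the LMS estimator
print(c, mean, var)           # 140, 1/2, 1/36
```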

Oh, probability and random variables, always keeping us on our toes! Let's see if we can clown around with this problem a bit.

To find the posterior distribution of the bias Y, given N=3, we'll need to use some math magic. But since I'm a humor bot, I'll just give you the formula without diving into the actual calculations:

fY|N(y∣N=3) = 140 y^3 (1-y)^3 (a Beta(4,4) density)

As for the LMS estimate of Y, given N=3, hmm... let's think. Ah, here it is:

Y^LMS = (N + 1) / (6 + 2) = 4/8 = 1/2

Lastly, the resulting conditional mean squared error of the LMS estimator, given N=3. Ah, let me get my crystal ball... oh wait, I don't have one. But I can still give you the formula:

MSE = 1/36

So there you have it! I hope my answers brightened your day a little bit. Best of luck with your probability calculations!

To solve this problem, we need to use Bayesian inference. Let's start by finding the posterior distribution of the bias Y, given N=3.

The probability of observing a specific number of Heads in 6 flips can be calculated using the binomial distribution. In this case, since N=3, we have:

P(N=3 | Y=y) = (6 choose 3) * y^3 * (1-y)^3

Next, we need to calculate the prior distribution of Y, which is the uniform distribution on the interval [0,1]. Since Y is uniformly distributed, we have:

P(Y=y) = 1, for 0 ≤ y ≤ 1

Now, we can use Bayes' theorem to calculate the posterior distribution of Y:

P(Y=y | N=3) = (P(N=3 | Y=y) * P(Y=y)) / P(N=3)

To find the marginal likelihood P(N=3), we need to integrate the joint distribution over all possible values of Y:

P(N=3) = ∫(0 to 1) (P(N=3 | Y=y) * P(Y=y)) dy

Using the given formula, we can evaluate this integral:

P(N=3) = ∫(0 to 1) ((6 choose 3) * y^3 * (1-y)^3 * 1) dy
= (6 choose 3) * (3! * 3!)/((3+3+1)!)
= 20 * 36/5040
= 1/7

Now, we can substitute these values back into the equation for the posterior distribution:

P(Y=y | N=3) = (P(N=3 | Y=y) * P(Y=y)) / P(N=3)

fY|N(y | N=3) = ((6 choose 3) * y^3 * (1-y)^3 * 1) / (1/7) = 140 * y^3 * (1-y)^3

This gives us the conditional distribution of Y, given N=3.
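As a sanity check on the marginal, here is a short stdlib-only sketch evaluating P(N=3) exactly:

```python
from fractions import Fraction
from math import comb, factorial

# P(N=3) = C(6,3) * integral of y^3 (1-y)^3 over [0,1] = C(6,3) * 3!*3!/7!
p_n3 = comb(6, 3) * Fraction(factorial(3) * factorial(3), factorial(7))
print(p_n3)  # 1/7
```

The value 1/7 is no accident: under a uniform prior, each of the seven outcomes N = 0, …, 6 is equally likely.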

Now, let's move on to finding the LMS (Least Mean Squares) estimate of Y, given N=3.

The LMS estimate is the conditional expectation of Y given the observation, i.e., the value that minimizes the conditional mean squared error. (It is not the value that maximizes the posterior; that would be the MAP estimate.) So we compute:

Y^LMS = E[Y | N=3] = ∫(0 to 1) y * 140 * y^3 * (1-y)^3 dy
= 140 * (4! * 3!)/((4+3+1)!)
= 140 * 144/40320
= 1/2

This also matches the symmetry of the posterior 140 y^3 (1-y)^3 about y = 1/2.

Finally, to find the conditional mean squared error of the LMS estimator, given N=3, we need to calculate the expected value of the squared difference between Y and Y^LMS:

E[(Y - Y^LMS)^2 | N=3]

Since Y^LMS is the conditional mean, this is just the conditional variance: integrating (y - 1/2)^2 against fY|N(y | N=3) over [0,1] gives 1/36.
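That last integral can be checked numerically with a short sketch (standard library only, midpoint rule):

```python
# Numeric check that E[(Y - 1/2)^2 | N=3] = 1/36, integrating
# (y - 1/2)^2 against the posterior density 140 y^3 (1-y)^3.
def mse(estimate=0.5, steps=100_000):
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        y = (i + 0.5) * h  # midpoint of the i-th subinterval
        total += (y - estimate) ** 2 * 140 * y ** 3 * (1 - y) ** 3 * h
    return total

print(mse())  # ~0.02778, i.e. 1/36
```

Trying `mse(estimate=0.4)` gives a larger value, consistent with the LMS estimate 1/2 being the minimizer.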

I hope this helps you understand the steps required to solve this problem. To obtain the numerical values, you need to perform the calculations using calculus and algebraic manipulations.

1. 140*y^3*(1-y)^3

2. 1/2
3. 1/36 ≈ 0.0278

This is my solution.

1) fY|N(y∣N=3)=84 y^6 (1-y)^3
2)0.0636363
3)0.0383746

Is this correct?