6. Biased coin

Problem 5. Biased coin

We are given a biased coin, where the probability of Heads is q. The bias q is itself the realization of a random variable Q which is uniformly distributed on the interval [0,1]. We want to estimate the bias of this coin. We flip it 5 times, and define the (observed) random variable N as the number of Heads in this experiment.

Given the observation N=3, calculate the posterior distribution of the bias Q. That is, find the conditional distribution of Q, given N=3.

For 0≤q≤1,

Question 1 --> fQ|N(q∣N=3)= ?

Question 2 --> What is the LMS estimate of Q, given N=3?

Question 3 --> What is the resulting conditional mean squared error of the LMS estimator, given N=3?

To find the conditional distribution of Q given N=3, we can use Bayes' theorem.

Let's define some variables:
- Q = the bias of the coin
- N = the number of Heads observed
- q = a specific value of Q
- fQ(q) = the prior distribution of Q

Question 1: fQ|N(q|N=3) = ?

Using Bayes' theorem, the conditional distribution of Q given N=3 can be calculated as:

fQ|N(q|N=3) = P(N=3|q) * fQ(q) / P(N=3)

P(N=3|q) = (5 choose 3) * q^3 * (1-q)^2 = 10 * q^3 * (1-q)^2
This is the probability of observing 3 Heads in 5 flips, given that the bias takes the specific value q.

fQ(q) = 1 for 0 ≤ q ≤ 1
Since Q is uniformly distributed on the interval [0,1], the prior density is constant and equal to 1 on that interval.

P(N=3) = integral from 0 to 1 of [P(N=3|q) * fQ(q)] dq

To find P(N=3), we integrate the likelihood of observing 3 Heads against the prior over all possible values of Q; this normalizing constant is evaluated below.
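As a cross-check, here is a minimal Python sketch (assuming sympy is available; the variable names are my own, not part of the original solution) that evaluates this normalizing integral and the resulting posterior symbolically. The same computation is carried out by hand below.

import sympy as sp

q = sp.symbols('q', nonnegative=True)

likelihood = sp.binomial(5, 3) * q**3 * (1 - q)**2   # P(N=3 | Q=q) = 10*q^3*(1-q)^2
prior = 1                                            # uniform prior density on [0, 1]

p_n3 = sp.integrate(likelihood * prior, (q, 0, 1))   # marginal P(N=3)
posterior = likelihood * prior / p_n3                # f_{Q|N}(q | N=3)

print(p_n3)                  # 1/6
print(sp.factor(posterior))  # 60*q**3*(q - 1)**2, i.e. 60*q^3*(1-q)^2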

Question 2: What is the LMS estimate of Q given N=3?

The LMS (Least Mean Squares) estimate of Q given N=3 is the posterior mean of Q given the observed value of N.

LMS estimate of Q = E[Q|N=3]
To find the LMS estimate, we compute the conditional expectation of Q under the posterior density found in Question 1.
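A matching sympy sketch for this posterior mean (again an illustrative script, not part of the original solution; it anticipates the value derived by hand below):

import sympy as sp

q = sp.symbols('q', nonnegative=True)
likelihood = sp.binomial(5, 3) * q**3 * (1 - q)**2            # P(N=3 | Q=q)
posterior = likelihood / sp.integrate(likelihood, (q, 0, 1))  # f_{Q|N}(q | N=3)

lms_estimate = sp.integrate(q * posterior, (q, 0, 1))         # E[Q | N=3]
print(lms_estimate)                                           # 4/7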

Question 3: What is the resulting conditional mean squared error of the LMS estimator, given N=3?

The conditional mean squared error of the LMS estimator given N=3 is the conditional variance of Q given the observed value of N.

Conditional mean squared error = var[Q|N=3]
To find the conditional mean squared error, we compute the conditional variance of Q given N=3, which equals E[Q^2|N=3] - (E[Q|N=3])^2.
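The same kind of sympy sketch for the conditional variance (illustrative only; it anticipates the value computed by hand below):

import sympy as sp

q = sp.symbols('q', nonnegative=True)
likelihood = sp.binomial(5, 3) * q**3 * (1 - q)**2            # P(N=3 | Q=q)
posterior = likelihood / sp.integrate(likelihood, (q, 0, 1))  # f_{Q|N}(q | N=3)

mean = sp.integrate(q * posterior, (q, 0, 1))                 # E[Q | N=3] = 4/7
second_moment = sp.integrate(q**2 * posterior, (q, 0, 1))     # E[Q^2 | N=3] = 5/14
print(second_moment - mean**2)                                # var(Q | N=3) = 3/98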

Carrying out the computations, we use the identity: integral from 0 to 1 of q^a * (1-q)^b dq = a! * b! / (a+b+1)! for nonnegative integers a and b.

Question 1: The normalizing constant is

P(N=3) = integral from 0 to 1 of 10 * q^3 * (1-q)^2 dq = 10 * (3! * 2!) / 6! = 10/60 = 1/6

Therefore, for 0 ≤ q ≤ 1,

fQ|N(q|N=3) = P(N=3|q) * fQ(q) / P(N=3) = 10 * q^3 * (1-q)^2 / (1/6) = 60 * q^3 * (1-q)^2

which is a Beta(4,3) density.

Question 2: The LMS estimate is the posterior mean:

E[Q|N=3] = integral from 0 to 1 of q * 60 * q^3 * (1-q)^2 dq = 60 * (4! * 2!) / 7! = 60 * 48/5040 = 4/7

(For a uniform prior this matches the general rule E[Q | k Heads in n flips] = (k+1)/(n+2); here (3+1)/(5+2) = 4/7.)

Question 3: The conditional mean squared error of the LMS estimator is the posterior variance. First,

E[Q^2|N=3] = integral from 0 to 1 of q^2 * 60 * q^3 * (1-q)^2 dq = 60 * (5! * 2!) / 8! = 60 * 240/40320 = 5/14

so

var(Q|N=3) = E[Q^2|N=3] - (E[Q|N=3])^2 = 5/14 - 16/49 = 35/98 - 32/98 = 3/98

Answer 1: fQ|N(q∣N=3) = 60 * q^3 * (1-q)^2 for 0 ≤ q ≤ 1 (and 0 otherwise).

Answer 2: The LMS estimate of Q, given N=3, is E[Q|N=3] = 4/7.
Answer 3: The resulting conditional mean squared error of the LMS estimator, given N=3, is var(Q|N=3) = 3/98.
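Finally, a quick Monte Carlo sanity check (an illustrative numpy script, with a seed and sample size of my own choosing): draw Q uniformly, flip the coin 5 times, keep only the runs with exactly 3 Heads, and compare the empirical conditional mean and variance with 4/7 ≈ 0.571 and 3/98 ≈ 0.0306.

import numpy as np

rng = np.random.default_rng(0)
n_trials = 1_000_000

q_samples = rng.uniform(0.0, 1.0, size=n_trials)   # Q ~ Uniform[0, 1]
heads = rng.binomial(5, q_samples)                  # N | Q=q ~ Binomial(5, q)
q_given_3 = q_samples[heads == 3]                   # condition on the event N = 3

print(q_given_3.mean())   # approximately 4/7  ~ 0.571
print(q_given_3.var())    # approximately 3/98 ~ 0.0306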