In this problem, we consider an application of posterior statistics. Suppose that we have five loaded coins whose probabilities of landing heads are 0.2, 0.4, 0.4, 0.6, and 0.8. The coins are placed in a bag, and one coin is drawn uniformly at random; let \lambda_0 denote the probability that the drawn coin comes up heads on a given toss. Our goal is to infer the probability that the chosen coin lands heads, and we extract information by flipping the coin n times and recording the outcomes of the tosses. In a Bayesian model, given this description of the situation, what is the prior distribution of the parameter of interest \lambda, which we define as the probability that the chosen coin lands heads?

Enter the prior probabilities for \lambda = 0.2, \lambda = 0.4, \lambda = 0.6, \lambda = 0.8, as a vector \begin{pmatrix} \mathbf{P}(\lambda =0.2)& \mathbf{P}(\lambda =0.4)& \mathbf{P}(\lambda =0.6)& \mathbf{P}(\lambda =0.8) \end{pmatrix}. For example, if \, \mathbf{P}(\lambda =0.2)=0.5,\, \mathbf{P}(\lambda =0.4)=0.1,\, \mathbf{P}(\lambda =0.6)=0.1,\, \mathbf{P}(\lambda =0.8)=0.3,\, then enter [0.5,0.1,0.1,0.3]. Note the components are separated by commas, and the vector is enclosed by square brackets.

\begin{pmatrix} \mathbf{P}(\lambda =0.2)& \mathbf{P}(\lambda =0.4)& \mathbf{P}(\lambda =0.6)& \mathbf{P}(\lambda =0.8) \end{pmatrix}=\quad
[0.2,0.4,0.2,0.2]

Two of the five coins have heads probability 0.4, so \mathbf{P}(\lambda =0.4)=2/5=0.4, while each of the other three values corresponds to a single coin and is drawn with probability 1/5=0.2.
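As a quick sanity check of this counting argument, the prior can be computed directly from the list of coins (a minimal sketch; the variable names are illustrative, not part of the problem):

```python
from collections import Counter

# Heads probabilities of the five loaded coins, as given in the problem.
coins = [0.2, 0.4, 0.4, 0.6, 0.8]

# Drawing a coin uniformly at random induces a prior on lambda:
# each distinct value gets (number of coins with that value) / (total coins).
counts = Counter(coins)
prior = {lam: count / len(coins) for lam, count in sorted(counts.items())}

print(prior)  # {0.2: 0.2, 0.4: 0.4, 0.6: 0.2, 0.8: 0.2}
```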
Let the observations be X_1, X_2, \cdots, X_n, which are modelled as Bernoulli random variables indicating whether a head was tossed on each of the n tosses. Find a general expression for the likelihood function L_n(X_1, X_2, \cdots, X_n | \lambda) in terms of \lambda, n, and \displaystyle \sum_{i=1}^{n} X_i.

(Enter Sigma_i(X_i) for \displaystyle \sum_{i=1}^{n} X_i. Do not worry if the parser does not render it properly; the grader works independently. If you wish to have proper rendering, enclose Sigma_i(X_i) in brackets.)

For this problem, write your answer in proportionality notation such that when \lambda =0.5, the value of the likelihood function is 0.5^ n regardless of the value of the X_ i's.

L_n(X_1, X_2, \cdots, X_n | \lambda) \propto \quad

L_n(X_1, X_2, \cdots, X_n | \lambda) \propto \lambda^{\sum_{i=1}^{n} X_i} (1-\lambda)^{n-\sum_{i=1}^{n} X_i}
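The normalization convention stated above can be verified numerically: under this proportionality form, the value at \lambda = 0.5 is 0.5^n for every possible outcome sequence. A short sketch (n = 4 is an arbitrary illustrative choice):

```python
from itertools import product

# With L_n proportional to lambda^S * (1 - lambda)^(n - S), where S is the
# number of heads, the value at lambda = 0.5 collapses to 0.5^n for any
# outcome sequence, since 0.5^S * 0.5^(n - S) = 0.5^n.
n = 4
for xs in product([0, 1], repeat=n):       # all 2^n possible toss sequences
    s = sum(xs)                            # number of heads in this sequence
    value = 0.5**s * (1 - 0.5)**(n - s)    # likelihood form at lambda = 0.5
    assert value == 0.5**n

print("value is 0.5^n for all", 2**n, "sequences")
```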

To find the likelihood function, we consider the probability of the observed outcomes given the parameter \lambda.

Since the observations X_1, X_2, \cdots, X_n are modelled as independent Bernoulli(\lambda) random variables indicating whether a head was tossed on each of the n tosses, each toss contributes a factor \lambda^{X_i}(1-\lambda)^{1-X_i}, and multiplying these factors over the n independent tosses gives the general expression for the likelihood function in terms of \lambda, n, and \sum_{i=1}^{n} X_i:

L_n(X_1, X_2, \cdots, X_n | \lambda) \propto \lambda^{\sum_{i=1}^{n} X_i} (1-\lambda)^{n-\sum_{i=1}^{n} X_i}

Here, \sum_{i=1}^{n} X_i is the total number of observed heads (the tosses with X_i = 1). We use the symbol \propto to denote proportionality, since the normalization constant is not needed. Note that at \lambda = 0.5 this expression equals 0.5^{\sum X_i} \cdot 0.5^{n - \sum X_i} = 0.5^n regardless of the values of the X_i's, as required.
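Combining this likelihood with the prior from the first part yields the posterior over \lambda by Bayes' rule. A minimal sketch, assuming a small hypothetical sequence of tosses (the data below is illustrative, not part of the problem):

```python
# Posterior over lambda for the five-coin problem: prior * likelihood, normalized.
lambdas = [0.2, 0.4, 0.6, 0.8]
prior = [0.2, 0.4, 0.2, 0.2]   # two of the five coins have lambda = 0.4

tosses = [1, 0, 1, 1, 0, 1]    # 1 = heads; hypothetical outcomes of n = 6 flips
heads = sum(tosses)
n = len(tosses)

# Likelihood L_n proportional to lambda^heads * (1 - lambda)^(n - heads).
likelihood = [lam**heads * (1 - lam)**(n - heads) for lam in lambdas]

# Bayes' rule: posterior proportional to prior * likelihood; normalize to sum to 1.
unnorm = [p * l for p, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]

print(dict(zip(lambdas, posterior)))
```

With 4 heads in 6 tosses (an empirical frequency of 2/3), the posterior concentrates near \lambda = 0.6, as one would expect.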