The lifetime (in thousands of hours) X of a light bulb has pdf

g(x)= \lambda e^{-\lambda x}, \hspace{3mm} x\geq 0

for some unknown \lambda >0.

We collect {\color{blue}{n=33}} independent lightbulbs at random and record their lifetimes X_1, \ldots, X_n, which are all independent copies of X. We find that {\color{blue}{\overline{X}_n = 42.6}} thousand hours.

Now, consider the prior distribution \lambda \sim \mathsf{Exp}(\theta ), for some fixed parameter \theta >0.

Compute the posterior pdf \displaystyle \pi(\lambda | X_1, \ldots, X_n) of \lambda up to normalizing constant. Enter your answer in terms of \lambda, \theta, n and S_n = \sum_{i=1}^{n} X_i.

(Enter S_n for S_n = \sum_{i=1}^{n} X_i.)

(Any answer correct up to a normalizing constant will be accepted.)

\pi(\lambda | X_1, \ldots, X_n) = \quad

Compute the maximum a posteriori estimator \hat{\lambda}^{\text{MAP}} of \lambda.

Enter your answer in terms of \theta, n and S_n = \sum_{i=1}^{n} X_i.

(Enter S_n for S_n = \sum_{i=1}^{n} X_i.)

\hat{\lambda}^{\text{MAP}} = \quad

To find the posterior pdf of λ, we use Bayes' theorem, which gives, up to a normalizing constant:

π(λ|X_1,...,X_n) ∝ f(X_1,...,X_n|λ) * π(λ)

where f(X_1,...,X_n|λ) is the likelihood function and π(λ) is the prior pdf.

Since the X_i are independent, the likelihood function is the product of the individual exponential pdfs:

f(X_1,...,X_n|λ) = λ^n * e^(-λ∑X_i)
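As a quick sanity check (not part of the required answer), a short sympy sketch confirms that the product of exponential pdfs collapses to this form; the three-symbol sample below is just a hypothetical stand-in for X_1, ..., X_n:

```python
# Minimal sympy sketch (illustration only): the product of pdfs
# lambda * exp(-lambda * x_i) collapses to lambda^n * exp(-lambda * sum(x_i)).
import sympy as sp

lam = sp.symbols('lambda', positive=True)
xs = sp.symbols('x1 x2 x3', positive=True)  # hypothetical sample of size 3

likelihood = sp.Integer(1)
for xi in xs:
    likelihood *= lam * sp.exp(-lam * xi)  # multiply the individual pdfs

expected = lam**3 * sp.exp(-lam * sum(xs))
assert likelihood.equals(expected)  # lambda^3 * e^{-lambda(x1+x2+x3)}
```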

The prior distribution is given as λ ∼ Exp(θ), which means its pdf is:

π(λ) = θ * e^(-θλ)

Now we can plug in the values and simplify:

π(λ|X_1,...,X_n) ∝ λ^n * e^(-λ∑X_i) * θ * e^(-θλ)

= λ^n * θ * e^(-λ(S_n + θ)),

where S_n = ∑X_i.
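Up to its normalizing constant, this is the kernel of a Gamma distribution with shape n + 1 and rate S_n + θ. A quick numerical sketch can confirm the proportionality against scipy's gamma pdf; the values of n, S_n and θ below are placeholders for illustration, not the problem's data:

```python
# Sketch (assumed placeholder values): the unnormalized posterior
# lambda^n * exp(-lambda*(S_n + theta)) is proportional to the pdf of
# Gamma(shape = n + 1, rate = S_n + theta).
import numpy as np
from scipy.stats import gamma

n, S_n, theta = 5, 12.0, 2.0        # placeholder values, not the problem's data
lam = np.linspace(0.01, 2.0, 200)

kernel = lam**n * np.exp(-lam * (S_n + theta))            # unnormalized posterior
pdf = gamma.pdf(lam, a=n + 1, scale=1.0 / (S_n + theta))  # rate -> scale = 1/rate

ratio = kernel / pdf
assert np.allclose(ratio, ratio[0])  # constant ratio, i.e. proportional
```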

The maximum a posteriori (MAP) estimator of λ is the value of λ that maximizes the posterior pdf, or equivalently its logarithm. Note that S_n = ∑X_i is fixed by the data (S_n = n * X̄_n = 33 * 42.6 = 1405.8 thousand hours). Taking the logarithm of the posterior kernel,

log π(λ|X_1,...,X_n) = n * log(λ) - λ(S_n + θ) + log(θ)

then taking the derivative with respect to λ and setting it to zero, we have:

n/λ - (S_n + θ) = 0

Solving for λ, the maximum a posteriori estimator of λ is:

λ̂^MAP = n/(S_n + θ)

where S_n is the sum of the observed lifetimes, ∑X_i.
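A brief symbolic check of this critical-point computation, using sympy (an illustration, not part of the graded answer):

```python
# Symbolic check: d/dlambda [n*log(lambda) - lambda*(S_n + theta) + log(theta)]
# vanishes at lambda = n/(S_n + theta).
import sympy as sp

lam, theta, S_n, n = sp.symbols('lambda theta S_n n', positive=True)

log_post = n * sp.log(lam) - lam * (S_n + theta) + sp.log(theta)
crit = sp.solve(sp.diff(log_post, lam), lam)
print(crit)  # [n/(S_n + theta)]
```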

Alternatively, we can derive the same result while keeping the normalizing constant explicit. Bayes' theorem states that:

π(λ|X₁, ..., Xₙ) = (f(X₁, ..., Xₙ|λ) * π(λ))/∫[f(X₁, ..., Xₙ|λ) * π(λ)] dλ

Where f(X₁, ..., Xₙ|λ) is the likelihood function, and π(λ) is the prior distribution.

In this case, we have:

f(X₁, ..., Xₙ|λ) = ∏[g(Xᵢ)] = ∏[λ * e^(-λ * Xᵢ)] = λⁿ * e^(-λ * (X₁ + ... + Xₙ))

π(λ) = θ * e^(-θ * λ), the pdf of the Exp(θ) distribution

Substituting these values into Bayes' theorem, we have:

π(λ|X₁, ..., Xₙ) = (λⁿ * e^(-λ * (X₁ + ... + Xₙ)) * θ * e^(-θ * λ))/∫[(λⁿ * e^(-λ * (X₁ + ... + Xₙ)) * θ * e^(-θ * λ))] dλ

To compute the maximum a posteriori (MAP) estimator of λ, we need to find the value of λ that maximizes the posterior pdf, which is the same as finding the value of λ that maximizes the numerator of the above expression.

Taking the logarithm of the numerator to simplify the computation, we have:

log(λⁿ * e^(-λ * (X₁ + ... + Xₙ)) * θ * e^(-θ * λ)) = log(λⁿ) - λ * (X₁ + ... + Xₙ) + log(θ) - θ * λ

To find the maximum, we take the derivative with respect to λ, set it equal to zero, and solve for λ:

(d/dλ)[log(λⁿ) - λ * (X₁ + ... + Xₙ) + log(θ) - θ * λ] = (n/λ) - (X₁ + ... + Xₙ) - θ = 0

Solving for λ, we have:

λ = n/(X₁ + ... + Xₙ + θ)

Therefore, the maximum a posteriori (MAP) estimator of λ is:

𝜆̂^MAP = n/(Sₙ + θ)

where Sₙ = X₁ + ... + Xₙ.
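As a final numeric check with the problem's data (n = 33 and mean 42.6 thousand hours, so Sₙ = 1405.8), maximizing the log posterior numerically recovers the closed form. The prior parameter θ = 1.0 below is only an assumed illustration value, since the problem leaves θ fixed but unspecified:

```python
# Numeric check with the problem's data: n = 33, mean 42.6 => S_n = 1405.8.
# theta = 1.0 is an assumed illustration value.
import numpy as np
from scipy.optimize import minimize_scalar

n, S_n, theta = 33, 33 * 42.6, 1.0

def neg_log_post(lam):
    # negative log of the posterior kernel lambda^n * theta * exp(-lambda*(S_n + theta))
    return -(n * np.log(lam) - lam * (S_n + theta) + np.log(theta))

res = minimize_scalar(neg_log_post, bounds=(1e-6, 1.0), method='bounded')
closed_form = n / (S_n + theta)  # = 33 / 1406.8 ≈ 0.0235 per thousand hours

assert np.isclose(res.x, closed_form, rtol=1e-3)
print(res.x, closed_form)
```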