The lifetime (in thousands of hours) X of a light bulb has pdf

\[
g(x) = \lambda e^{-\lambda x}, \quad x \ge 0
\]

for some unknown \(\lambda > 0\).

Find the asymptotic variance \(V_{\hat{\lambda}^{\text{MLE}}}\) of \(\hat{\lambda}^{\text{MLE}}\).

\(V_{\hat{\lambda}^{\text{MLE}}} = \quad\)

To find the asymptotic variance of the maximum likelihood estimator (MLE) of \(\lambda\), we need to use the Fisher information.

The Fisher information \(I(\lambda)\) for a single observation is given by:

\[
I(\lambda) = -E\left[\frac{d^2}{d\lambda^2} \log g(X)\right]
\]

Since \(\log g(x) = \log\lambda - \lambda x\), the second derivative of the log-density with respect to \(\lambda\) is:

\[
\frac{d^2}{d\lambda^2} \log g(x) = -\frac{1}{\lambda^2}
\]

This quantity is constant in \(x\), so taking the expectation gives:

\[
E\left[\frac{d^2}{d\lambda^2} \log g(X)\right] = -\frac{1}{\lambda^2}
\]

Therefore, the Fisher information is:

\[
I(\lambda) = \frac{1}{\lambda^2}
\]

The asymptotic variance of the MLE of \(\lambda\) is the inverse of the Fisher information:

\[
V_{\hat{\lambda}^{\text{MLE}}} = \frac{1}{I(\lambda)} = \lambda^2
\]

So, the asymptotic variance of the MLE of \(\lambda\) is \(\lambda^2\), in the sense that \(\sqrt{n}\left(\hat{\lambda}^{\text{MLE}} - \lambda\right) \xrightarrow{d} N(0, \lambda^2)\).
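
As a cross-check, the information equality \(I(\lambda) = \operatorname{Var}\left(\frac{d}{d\lambda} \log g(X)\right)\), valid under the usual regularity conditions, yields the same value:

\[
\frac{d}{d\lambda} \log g(X) = \frac{1}{\lambda} - X,
\qquad
\operatorname{Var}\left(\frac{1}{\lambda} - X\right) = \operatorname{Var}(X) = \frac{1}{\lambda^2},
\]

since an exponential random variable with rate \(\lambda\) has variance \(1/\lambda^2\).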

For a more detailed derivation based on a sample of \(n\) lifetimes, we again evaluate the Fisher information for the maximum likelihood estimator (MLE) \(\hat{\lambda}\).

The Fisher information \(I(\lambda)\) is the expected value of the observed Fisher information \(I(\lambda; X)\), which is the negative second derivative of the log-likelihood function with respect to \(\lambda\).

Given the probability density function (pdf) \(g(x) = \lambda e^{-\lambda x}\) and an i.i.d. sample \(x_1, \dots, x_n\), the log-likelihood function is:

\[
\begin{aligned}
\ell(\lambda) &= \sum_{i=1}^{n}\left(\log{\lambda} - \lambda x_i\right) \\
&= n\log{\lambda} - \lambda\sum_{i=1}^{n}{x_i}
\end{aligned}
\]

Taking the first derivative with respect to \(\lambda\), we have:

\[
\frac{d\ell(\lambda)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n}{x_i}
\]

Setting the derivative equal to zero and solving for \(\lambda\), we get the maximum likelihood estimate:

\[
\hat{\lambda}_{\text{MLE}} = \frac{n}{\sum_{i=1}^{n}{x_i}}
\]
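
As an illustration, here is a minimal sketch (assuming NumPy is available; the rate \(\lambda = 2\) and the sample size are arbitrary choices) that simulates lifetimes and computes this estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

lam_true = 2.0   # hypothetical true rate, chosen arbitrarily
n = 10_000       # hypothetical sample size

# Simulate n lifetimes; NumPy parametrizes the exponential by its scale 1/lambda.
x = rng.exponential(scale=1.0 / lam_true, size=n)

# MLE: lambda_hat = n / sum(x_i), i.e. the reciprocal of the sample mean.
lam_hat = n / x.sum()
print(lam_hat)  # close to lam_true for large n
```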

To find the second derivative of the log-likelihood function with respect to \(\lambda\), we differentiate again:

\[
\frac{d^2\ell(\lambda)}{d\lambda^2} = -\frac{n}{\lambda^2}
\]

Taking the negative of the second derivative, we obtain the observed Fisher information:

\[
I(\lambda; X) = \frac{n}{\lambda^2}
\]
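
This can be checked numerically: a central-difference sketch (with arbitrary made-up lifetimes) recovers \(-n/\lambda^2\) for the curvature of \(\ell\), independent of the data:

```python
import numpy as np

x = np.array([0.5, 1.2, 0.3, 2.0, 0.8])  # arbitrary made-up lifetimes
n = len(x)

def loglik(lam):
    # l(lambda) = n log(lambda) - lambda * sum(x_i)
    return n * np.log(lam) - lam * x.sum()

lam, h = 1.5, 1e-4
# Central second difference approximates the second derivative of l at lam.
d2 = (loglik(lam + h) - 2 * loglik(lam) + loglik(lam - h)) / h**2
print(d2, -n / lam**2)  # both approximately -2.2222
```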

To find the expected Fisher information, we take the expectation of the observed Fisher information under the pdf of \(X\). Since \(\frac{n}{\lambda^2}\) does not depend on the data, the expectation leaves it unchanged:

\[
I(\lambda) = \int_{0}^{\infty}{\frac{n}{\lambda^2} \cdot \lambda e^{-\lambda x} \, dx} = \frac{n}{\lambda^2} \int_{0}^{\infty}{\lambda e^{-\lambda x} \, dx} = \frac{n}{\lambda^2}
\]

where the remaining integral equals \(1\) because \(\lambda e^{-\lambda x}\) is a probability density on \([0, \infty)\).

Finally, the variance of the maximum likelihood estimator is approximately the inverse of the Fisher information:

\[
\operatorname{Var}\left(\hat{\lambda}_{\text{MLE}}\right) \approx \frac{1}{I(\lambda)} = \frac{\lambda^2}{n}
\]

Equivalently, \(\sqrt{n}\left(\hat{\lambda}_{\text{MLE}} - \lambda\right) \xrightarrow{d} N(0, \lambda^2)\), so the asymptotic variance of \(\hat{\lambda}_{\text{MLE}}\) is

\[
V_{\hat{\lambda}^{\text{MLE}}} = \lambda^2,
\]

consistent with the single-observation calculation above.
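
A quick Monte Carlo sketch (again assuming NumPy; all constants are arbitrary) confirms that the variance of \(\hat{\lambda}\) is close to \(\lambda^2/n\):

```python
import numpy as np

rng = np.random.default_rng(1)

lam_true, n, reps = 2.0, 500, 20_000  # arbitrary rate, sample size, replications

# reps independent samples of size n; the MLE for each row is 1 / sample mean.
samples = rng.exponential(scale=1.0 / lam_true, size=(reps, n))
lam_hats = 1.0 / samples.mean(axis=1)

print(lam_hats.var())    # empirical variance of the MLE across replications
print(lam_true**2 / n)   # theoretical lambda^2 / n = 0.008
```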