Argue that the proposed estimators Ξ»Μ‚ and λ̃ below are both consistent and asymptotically normal. Then give their asymptotic variances V(Ξ»Μ‚) and V(λ̃), and decide whether one of them is always bigger than the other.

Let X₁, …, Xβ‚™ ~ i.i.d. Poiss(Ξ»), for some Ξ» > 0. Let Ξ»Μ‚ = XΜ„β‚™ and λ̃ = βˆ’ln(YΜ„β‚™), where Yα΅’ = 1{Xα΅’ = 0}, i = 1, …, n.

V(Ξ»Μ‚) = ? and V(λ̃) = ?

----------------------------------------------------------------------------------------------------

As above, argue that both proposed estimators πœ†Λ† and πœ†Λœ are consistent and asymptotically normal. Then, give their asymptotic variances 𝑉(πœ†Λ†) and 𝑉(πœ†Λœ), and decide if one of them is always bigger than the other.

Let 𝑋1,…,π‘‹π‘›βˆΌπ‘–.𝑖.𝑑.𝖀𝗑𝗉(πœ†) , for some πœ†>0 . Let πœ†Λ†=1π‘‹βŽ―βŽ―βŽ―βŽ―βŽ―π‘› and πœ†Μƒ =βˆ’ln(π‘ŒβŽ―βŽ―βŽ―βŽ―π‘›) , where π‘Œπ‘–=1{𝑋𝑖>1},𝑖=1,…,𝑛 .

𝑉(πœ†Λ†) = ? and 𝑉(πœ†Λœ)=?

----------------------------------------------------------------------------------------------------
As above, argue that both proposed estimators pΜ‚ and p̃ are consistent and asymptotically normal. Then give their asymptotic variances V(pΜ‚) and V(p̃), and decide whether one of them is always bigger than the other.

Let X₁, …, Xβ‚™ ~ i.i.d. Geom(p), for some p ∈ (0, 1). That means that

P(X₁ = k) = p(1 βˆ’ p)^(kβˆ’1), for k = 1, 2, ….

Let

pΜ‚ = 1/XΜ„β‚™,

and let p̃ be the number of ones in the sample divided by n.

V(pΜ‚) = ? and V(p̃) = ?

The proposed estimators pΜ‚ and p̃ are both consistent and asymptotically normal. For this geometric distribution, E[X₁] = 1/p and Var(X₁) = (1 βˆ’ p)/pΒ², so XΜ„β‚™ β†’ 1/p in probability (Law of Large Numbers) and pΜ‚ = 1/XΜ„β‚™ β†’ p by continuity; the delta method with g(x) = 1/x, g'(1/p) = βˆ’pΒ², gives √n(pΜ‚ βˆ’ p) β†’ N(0, p⁴ Β· (1 βˆ’ p)/pΒ²), i.e. V(pΜ‚) = pΒ²(1 βˆ’ p). The estimator p̃ = YΜ„β‚™ with Yα΅’ = 1{Xα΅’ = 1} ~ Ber(p) is consistent by the Law of Large Numbers and asymptotically normal by the Central Limit Theorem, with V(p̃) = p(1 βˆ’ p). Since pΒ² < p for every p ∈ (0, 1), the asymptotic variance of p̃ is always bigger than that of pΜ‚; a Monte Carlo sanity check is sketched below.
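A minimal simulation sketch of that comparison, assuming NumPy's geometric generator (which, like the problem, puts the support on {1, 2, …}) and arbitrary choices of p, n, and the number of replications:

import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.3, 5_000, 2_000

p_hat = np.empty(reps)
p_tilde = np.empty(reps)
for r in range(reps):
    x = rng.geometric(p, size=n)    # geometric on {1, 2, ...}
    p_hat[r] = 1.0 / x.mean()       # p-hat = 1 / sample mean
    p_tilde[r] = (x == 1).mean()    # p-tilde = proportion of ones

# Empirical variance of sqrt(n)*(estimator - p) vs. the asymptotic values
print(n * p_hat.var(),   "reference:", p**2 * (1 - p))  # ~ p^2 (1 - p)
print(n * p_tilde.var(), "reference:", p * (1 - p))     # ~ p (1 - p)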

To prove that the proposed estimators πœ†Μ‚ and πœ†Μƒ are both consistent and asymptotically normal, we need to verify the following conditions:

1. Consistency: The estimators converge in probability to the true parameter πœ† as the sample size increases.

2. Asymptotic Normality: √n (estimator βˆ’ Ξ») converges in distribution to a centered normal N(0, V), where V is the asymptotic variance of the estimator.

Now, let's analyze each estimator separately:

1. Estimator πœ†Μ‚:
- Consistency: The estimator πœ†Μ‚ = π‘‹βŽ―βŽ―βŽ―βŽ―βŽ―π‘› is the sample mean, which is a well-known consistent estimator. It converges in probability to πœ† because of the Law of Large Numbers.
- Asymptotic Normality: Since πœ†Μ‚ is a sample mean, by the Central Limit Theorem, it follows an asymptotic normal distribution with mean πœ† and variance V(πœ†Μ‚) = πœ†/𝑛.

2. Estimator πœ†Μƒ:
- Consistency: The estimator πœ†Μƒ = -ln(π‘ŒβŽ―βŽ―βŽ―βŽ―π‘›) is the negative logarithm of the sample mean of the indicator variable π‘Œπ‘– = 1{𝑋𝑖 = 0}. As π‘‹π‘–βˆΌπ‘–.𝑖.𝑑.π–―π—ˆπ—‚π—Œπ—Œ(πœ†), the probability of observing 0 is (1-πœ†). Therefore, π‘Œπ‘– = 1{𝑋𝑖 = 0} follows a Bernoulli distribution with parameter (1-πœ†). By the Law of Large Numbers, the sample mean π‘ŒβŽ―βŽ―βŽ―βŽ―π‘› converges in probability to (1-πœ†). Taking the negative logarithm, -ln(π‘ŒβŽ―βŽ―βŽ―βŽ―π‘›) converges to ln(1/(1-πœ†)), which is πœ†. Hence, πœ†Μƒ is a consistent estimator for πœ†.
- Asymptotic Normality: We can use the Central Limit Theorem for Bernoulli random variables to show that -ln(π‘ŒβŽ―βŽ―βŽ―βŽ―π‘›) follows an asymptotic normal distribution with mean πœ† and variance V(πœ†Μƒ) = 1/((1-πœ†)𝑛).
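A small symbolic sketch of that delta-method step, assuming SymPy is available; it only re-derives the algebra above (g(x) = βˆ’ln x evaluated at p = e^(βˆ’Ξ»)) and is not part of the original argument:

import sympy as sp

lam = sp.symbols('lambda', positive=True)
x = sp.symbols('x', positive=True)

p = sp.exp(-lam)          # P(X_1 = 0) under Poiss(lambda)
g = -sp.log(x)            # lambda-tilde = g(Y-bar_n)
gprime = sp.diff(g, x)    # g'(x) = -1/x

# Delta method: asymptotic variance of g(Y-bar_n) is g'(p)^2 * Var(Y_1)
asym_var = (gprime.subs(x, p))**2 * p * (1 - p)
print(sp.simplify(asym_var))   # expect exp(lambda) - 1, possibly in an equivalent form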

Now, let's determine the asymptotic variances 𝑉(πœ†Μ‚) and 𝑉(πœ†Μƒ):

1. Asymptotic Variance of πœ†Μ‚ (V(πœ†Μ‚)):
From the above analysis, we have V(πœ†Μ‚) = πœ†/𝑛.

2. Asymptotic Variance of πœ†Μƒ (V(πœ†Μƒ)):
The expression for V(πœ†Μƒ) is given as 1/((1-πœ†)𝑛).

Finally, we compare the asymptotic variances V(Ξ»Μ‚) and V(λ̃). Since e^Ξ» = 1 + Ξ» + λ²/2 + … > 1 + Ξ» for every Ξ» > 0, we have V(λ̃) = e^Ξ» βˆ’ 1 > Ξ» = V(Ξ»Μ‚). So the asymptotic variance of λ̃ is always bigger than that of Ξ»Μ‚, whatever the true value of Ξ»; a simulation illustrating this is sketched below.
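A minimal Monte Carlo sketch illustrating this comparison, with arbitrary choices of Ξ», n, and the number of replications:

import numpy as np

rng = np.random.default_rng(2)
lam, n, reps = 2.0, 5_000, 2_000

lam_hat = np.empty(reps)
lam_tilde = np.empty(reps)
for r in range(reps):
    x = rng.poisson(lam, size=n)
    lam_hat[r] = x.mean()             # lambda-hat = sample mean
    ybar = (x == 0).mean()            # proportion of zeros, estimates e^(-lambda)
    lam_tilde[r] = -np.log(ybar)      # lambda-tilde = -ln(Y-bar_n)

# Empirical variance of sqrt(n)*(estimator - lambda) vs. the asymptotic values
print(n * lam_hat.var(),   "reference:", lam)               # ~ lambda
print(n * lam_tilde.var(), "reference:", np.exp(lam) - 1)   # ~ e^lambda - 1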

To determine whether the proposed estimators are consistent and asymptotically normal, we need to analyze their properties as the sample size (𝑛) approaches infinity. We will also find their asymptotic variances (𝑉(πœ†Λ†) and 𝑉(πœ†Μƒ)).

1. πœ†Μ‚ = 𝑋̄𝑛:
The estimator πœ†Μ‚ is the sample mean (𝑋̄) of the 𝑛 observations. For an 𝑖.𝑖.𝑑. Poisson distribution, the sample mean is an unbiased and consistent estimator of the population mean (πœ†).

- Consistency: The sample mean is consistent because as 𝑛 approaches infinity, the Law of Large Numbers ensures that the sample mean converges to the true population mean.
- Asymptotic Normality: Since the Poisson distribution has finite variance Var(X₁) = Ξ», the Central Limit Theorem applies to the sample mean: √n(Ξ»Μ‚ βˆ’ Ξ») converges in distribution to N(0, Ξ»). Therefore Ξ»Μ‚ is asymptotically normal.

2. πœ†Μƒ = -ln(π‘ŒΜ„π‘›):
The estimator πœ†Μƒ is the negative natural logarithm of the sample proportion of zeros (π‘ŒΜ„) in the 𝑛 observations. π‘Œπ‘– is an indicator function that is 1 when 𝑋𝑖 = 0 and 0 otherwise.

- Consistency: As n approaches infinity, the Law of Large Numbers ensures that YΜ„β‚™ converges to the true proportion of zeros, P(X₁ = 0) = e^(βˆ’Ξ»). Since the logarithm is continuous at e^(βˆ’Ξ») > 0, βˆ’ln(YΜ„β‚™) converges in probability to βˆ’ln(e^(βˆ’Ξ»)) = Ξ», so λ̃ is consistent.
- Asymptotic Normality: By the Central Limit Theorem for the sample proportion, YΜ„β‚™ is asymptotically normal. Because βˆ’ln(Β·) is differentiable at e^(βˆ’Ξ»), the delta method transfers this to λ̃, which is therefore also asymptotically normal.

Now, let's compute the asymptotic variances of πœ†Μ‚ and πœ†Μƒ.

1. 𝑉(πœ†Μ‚):
The asymptotic variance of πœ†Μ‚ can be determined using the formula for the variance of the sample mean for an 𝑖.𝑖.𝑑. Poisson distribution:
𝑉(πœ†Μ‚) = πœ† / 𝑛

2. 𝑉(πœ†Μƒ):
The asymptotic variance of πœ†Μƒ can be computed using the Delta Method. Since 𝑉(π‘ŒΜ„π‘›) = (𝑝(1 - 𝑝)) / 𝑛 for a Bernoulli distribution, applying the Delta Method gives:
𝑉(πœ†Μƒ) = (1 / πœ†πœ‹Β²) * (𝑝 / (1 - 𝑝))

Comparing the two variances, V(λ̃) βˆ’ V(Ξ»Μ‚) = e^Ξ» βˆ’ 1 βˆ’ Ξ» > 0 for every Ξ» > 0, because e^Ξ» > 1 + Ξ» whenever Ξ» > 0. So the asymptotic variance of λ̃ is always bigger than that of Ξ»Μ‚, regardless of the particular value of Ξ».

In summary:
- Ξ»Μ‚ and λ̃ are both consistent and asymptotically normal estimators.
- V(Ξ»Μ‚) = Ξ» is the asymptotic variance for Ξ»Μ‚.
- V(λ̃) = e^Ξ» βˆ’ 1 is the asymptotic variance for λ̃.
- Since e^Ξ» βˆ’ 1 > Ξ» for all Ξ» > 0, V(λ̃) is always bigger than V(Ξ»Μ‚); a quick numerical check is sketched below.
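A quick numerical illustration of that inequality over a few arbitrary values of Ξ»:

import numpy as np

# e^lambda - 1 exceeds lambda for every lambda > 0, so V(lambda-tilde) > V(lambda-hat)
for lam in [0.1, 0.5, 1.0, 2.0, 5.0]:
    print(f"lambda = {lam}:  V(hat) = {lam:.4f},  V(tilde) = {np.exp(lam) - 1:.4f}")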