Argue that the proposed estimators λ̂ and λ̃ below are both consistent and asymptotically normal. Then, give their asymptotic variances V(λ̂) and V(λ̃), and decide if one of them is always bigger than the other.
Let X_1, …, X_n ~ i.i.d. Pois(λ), for some λ > 0. Let λ̂ = X̄_n and λ̃ = −ln(Ȳ_n), where Y_i = 1{X_i = 0}, i = 1, …, n.
V(λ̂) = ? and V(λ̃) = ?
----------------------------------------------------------------------------------------------------
As above, argue that both proposed estimators λ̂ and λ̃ are consistent and asymptotically normal. Then, give their asymptotic variances V(λ̂) and V(λ̃), and decide if one of them is always bigger than the other.
Let X_1, …, X_n ~ i.i.d. Exp(λ), for some λ > 0. Let λ̂ = 1/X̄_n and λ̃ = −ln(Ȳ_n), where Y_i = 1{X_i > 1}, i = 1, …, n.
V(λ̂) = ? and V(λ̃) = ?
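Before deriving anything, a quick Monte Carlo sketch can check that both exponential-case estimators track the true rate; this assumes NumPy, and the rate λ = 0.8 and sample size are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 0.8, 200_000

x = rng.exponential(scale=1.0 / lam, size=n)  # Exp(rate=lam) has mean 1/lam
lam_hat = 1.0 / x.mean()                      # lam-hat = 1 / X-bar
lam_tilde = -np.log(np.mean(x > 1))           # uses P(X > 1) = exp(-lam)

print(lam_hat, lam_tilde)                     # both should be close to 0.8
```

Both estimates land near the true rate, consistent with the Law of Large Numbers argument the problems ask for.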
----------------------------------------------------------------------------------------------------
As above, argue that both proposed estimators p̂ and p̃ are consistent and asymptotically normal. Then, give their asymptotic variances V(p̂) and V(p̃) and decide if one of them is always bigger than the other.
Let X_1, …, X_n ~ i.i.d. Geom(p), for some p ∈ (0, 1). That means that
P(X_1 = k) = p(1 − p)^(k−1), for k = 1, 2, ….
Let
p̂ = 1/X̄_n,
and p̃ be the number of ones in the sample divided by n.
V(p̂) = ? and V(p̃) = ?
The proposed estimators p̂ and p̃ are both consistent and asymptotically normal. By the delta method applied to g(x) = 1/x at E[X_1] = 1/p (with Var(X_1) = (1 − p)/p²), the asymptotic variance of p̂ is V(p̂) = p²(1 − p). Since p̃ is the sample proportion of the Bernoulli indicators 1{X_i = 1}, whose success probability is P(X_1 = 1) = p, its asymptotic variance is V(p̃) = p(1 − p). Because 0 < p < 1, p²(1 − p) < p(1 − p), so V(p̃) is always bigger than V(p̂).
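These geometric-case variances can be checked empirically. The sketch below assumes NumPy; p = 0.3, n, and the replication count are arbitrary illustrative choices. Scaling the empirical variance of each estimator by n should recover the asymptotic variances p²(1 − p) and p(1 − p):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 5_000, 2_000

p_hat = np.empty(reps)
p_tilde = np.empty(reps)
for r in range(reps):
    # numpy's geometric is supported on {1, 2, ...}, matching P(X=k) = p(1-p)^(k-1)
    x = rng.geometric(p, size=n)
    p_hat[r] = 1.0 / x.mean()        # p-hat = 1 / sample mean
    p_tilde[r] = np.mean(x == 1)     # p-tilde = proportion of ones

# n * Var should approach the asymptotic variances
print(n * p_hat.var(), p**2 * (1 - p))   # theoretical value 0.063
print(n * p_tilde.var(), p * (1 - p))    # theoretical value 0.21
```

With p = 0.3 the scaled variances settle near 0.063 and 0.21, matching the claim that p̃ always has the larger variance.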
To prove that the proposed estimators λ̂ and λ̃ are both consistent and asymptotically normal, we need to verify the following conditions:
1. Consistency: the estimators converge in probability to the true parameter λ as the sample size increases.
2. Asymptotic normality: √n times the estimation error converges in distribution to a centered normal, i.e. √n(estimator − λ) → N(0, V), where V is the asymptotic variance.
Now, let's analyze each estimator separately:
1. Estimator λ̂:
- Consistency: The estimator λ̂ = X̄_n is the sample mean, a well-known consistent estimator. It converges in probability to λ = E[X_1] by the Law of Large Numbers.
- Asymptotic Normality: Since λ̂ is a sample mean and Var(X_1) = λ for the Poisson distribution, the Central Limit Theorem gives √n(λ̂ − λ) → N(0, λ). Hence the asymptotic variance is V(λ̂) = λ (equivalently, Var(λ̂) = λ/n).
2. Estimator λ̃:
- Consistency: The estimator λ̃ = −ln(Ȳ_n) is the negative logarithm of the sample mean of the indicators Y_i = 1{X_i = 0}. As X_i ~ i.i.d. Pois(λ), the probability of observing a zero is P(X_i = 0) = e^{−λ}. Therefore Y_i follows a Bernoulli distribution with parameter e^{−λ}. By the Law of Large Numbers, Ȳ_n converges in probability to e^{−λ}, and since x ↦ −ln(x) is continuous at e^{−λ} > 0, −ln(Ȳ_n) converges in probability to −ln(e^{−λ}) = λ. Hence λ̃ is a consistent estimator of λ.
- Asymptotic Normality: By the CLT for Bernoulli random variables, √n(Ȳ_n − e^{−λ}) → N(0, e^{−λ}(1 − e^{−λ})). Applying the delta method with g(x) = −ln(x), whose derivative at e^{−λ} is −e^{λ}, gives √n(λ̃ − λ) → N(0, e^{2λ} · e^{−λ}(1 − e^{−λ})), so V(λ̃) = e^{λ} − 1.
Now, let's determine the asymptotic variances V(λ̂) and V(λ̃):
1. Asymptotic variance of λ̂:
From the above analysis, V(λ̂) = λ.
2. Asymptotic variance of λ̃:
From the delta method, V(λ̃) = e^{λ} − 1.
Finally, we compare V(λ̂) and V(λ̃). For every λ > 0 the strict inequality e^{λ} > 1 + λ holds, so e^{λ} − 1 > λ. Therefore V(λ̃) > V(λ̂) for all λ > 0: the variance of λ̃ is always bigger, no specific values of λ or n are needed, and λ̂ is the preferable estimator.
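The two limiting variances λ and e^{λ} − 1 can be verified by simulation. This is a minimal sketch assuming NumPy; λ = 1.5, n, and the replication count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n, reps = 1.5, 5_000, 2_000

lam_hat = np.empty(reps)
lam_tilde = np.empty(reps)
for r in range(reps):
    x = rng.poisson(lam, size=n)
    lam_hat[r] = x.mean()                    # sample mean
    lam_tilde[r] = -np.log(np.mean(x == 0))  # Y-bar estimates P(X=0) = exp(-lam)

# n * Var should approach the asymptotic variances
print(n * lam_hat.var(), lam)                 # theoretical value 1.5
print(n * lam_tilde.var(), np.exp(lam) - 1)   # theoretical value e^1.5 - 1
```

The scaled empirical variance of λ̃ comes out near e^{1.5} − 1 ≈ 3.48, clearly above λ = 1.5, in line with the comparison above.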
To determine whether the proposed estimators are consistent and asymptotically normal, we need to analyze their properties as the sample size n approaches infinity. We will also find their asymptotic variances V(λ̂) and V(λ̃).
1. λ̂ = X̄_n:
The estimator λ̂ is the sample mean X̄_n of the n observations. For an i.i.d. Poisson sample, the sample mean is an unbiased and consistent estimator of the population mean λ.
- Consistency: The sample mean is consistent because, as n approaches infinity, the Law of Large Numbers ensures that it converges to the true population mean.
- Asymptotic Normality: Since the Poisson distribution has a finite variance, the Central Limit Theorem applies to the sample mean: √n(X̄_n − λ) converges in distribution to a centered normal. Therefore λ̂ is asymptotically normal.
2. λ̃ = −ln(Ȳ_n):
The estimator λ̃ is the negative natural logarithm of the sample proportion of zeros, Ȳ_n, among the n observations. Y_i is an indicator that equals 1 when X_i = 0 and 0 otherwise, so each Y_i ~ Ber(e^{−λ}).
- Consistency: As n approaches infinity, the Law of Large Numbers ensures that Ȳ_n converges to the true proportion of zeros, P(X_1 = 0) = e^{−λ}. Since the logarithm is continuous on (0, ∞), the continuous mapping theorem gives λ̃ = −ln(Ȳ_n) → −ln(e^{−λ}) = λ in probability.
- Asymptotic Normality: By the Central Limit Theorem for the sample proportion, Ȳ_n is asymptotically normal. Applying the delta method to the differentiable map x ↦ −ln(x) then yields an asymptotically normal estimator λ̃.
Now, let's compute the asymptotic variances of λ̂ and λ̃.
1. V(λ̂):
For an i.i.d. Poisson sample, Var(X_1) = λ, so Var(X̄_n) = λ/n and the asymptotic variance is
V(λ̂) = λ.
2. V(λ̃):
The asymptotic variance of λ̃ can be computed using the delta method. Since Var(Ȳ_n) = e^{−λ}(1 − e^{−λ})/n for the Bernoulli indicators, applying the delta method with g(x) = −ln(x) (so g′(e^{−λ})² = e^{2λ}) gives
V(λ̃) = e^{2λ} · e^{−λ}(1 − e^{−λ}) = e^{λ} − 1.
Comparing the two variances: since e^{λ} − 1 > λ for every λ > 0, V(λ̃) is always strictly bigger than V(λ̂); no particular values of λ or n are needed.
In summary:
- λ̂ and λ̃ are both consistent and asymptotically normal estimators.
- V(λ̂) = λ is the asymptotic variance of λ̂.
- V(λ̃) = e^{λ} − 1 is the asymptotic variance of λ̃.
- V(λ̃) > V(λ̂) for every λ > 0, so λ̃ always has the larger asymptotic variance.
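Since e^{λ} = 1 + λ + λ²/2 + …, the gap (e^{λ} − 1) − λ = λ²/2 + … is strictly positive for every λ > 0. A tiny stdlib check of the comparison across several rates:

```python
import math

# V(lam-hat) = lam versus V(lam-tilde) = exp(lam) - 1: the tilde variance is always larger
for lam in (0.1, 0.5, 1.0, 2.0, 5.0):
    v_hat, v_tilde = lam, math.exp(lam) - 1
    print(f"lam={lam:>4}: V(hat)={v_hat:.4f}  V(tilde)={v_tilde:.4f}  tilde bigger: {v_tilde > v_hat}")
```

Every row prints a strictly larger V(λ̃), matching the series argument above.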