Let \widehat\Theta be an estimator of a random variable \Theta, and let \widetilde\Theta =\widehat\Theta -\Theta be the estimation error.

a) In this part of the problem, let \widehat\Theta be specifically the LMS estimator of \Theta. We have seen that for the case of the LMS estimator, {\bf E}[\widetilde\Theta \mid X=x]=0 for every x. Is it also true that {\bf E}[\widetilde\Theta \mid \Theta =\theta ]=0 for all \theta? Equivalently, is it true that {\bf E}[\widehat\Theta \mid \Theta =\theta ]=\theta for all \theta?

b) In this part of the problem, \widehat\Theta is no longer necessarily the LMS estimator of \Theta. Is the property \textsf{Var}(\Theta )=\textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta ) true for every estimator \widehat\Theta?

a) No. The statement {\bf E}[\widetilde\Theta \mid \Theta =\theta ]=0 for all \theta would mean that the LMS estimator is conditionally unbiased given \Theta, i.e., that {\bf E}[\widehat\Theta \mid \Theta =\theta ]=\theta for every \theta. The property we do know, {\bf E}[\widetilde\Theta \mid X=x]=0 for every x, says only that the error averages to zero over the posterior distribution of \Theta given the observation; it says nothing about averaging over the observations for a fixed \theta. In fact, the LMS estimator typically pulls the estimate toward the prior mean, so for values of \theta away from the prior mean the conditional error {\bf E}[\widetilde\Theta \mid \Theta =\theta ] is nonzero, and the statement is not necessarily true. (A concrete example follows.)
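
To make this concrete, here is a standard illustration (the specific normal model is chosen for this write-up, not given in the problem): let \Theta \sim N(0,\sigma _0^2) and X=\Theta +W, where W\sim N(0,\sigma ^2) is independent of \Theta. The LMS estimator is then \widehat\Theta ={\bf E}[\Theta \mid X]=\frac{\sigma _0^2}{\sigma _0^2+\sigma ^2}\, X, so {\bf E}[\widehat\Theta \mid \Theta =\theta ]=\frac{\sigma _0^2}{\sigma _0^2+\sigma ^2}\,\theta and hence {\bf E}[\widetilde\Theta \mid \Theta =\theta ]=-\frac{\sigma ^2}{\sigma _0^2+\sigma ^2}\,\theta, which is nonzero for every \theta \neq 0.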

b) No. The property \textsf{Var}(\Theta )=\textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta ) holds for the LMS estimator but is not true for every estimator \widehat\Theta. What makes it work in the LMS case is that the estimation error is uncorrelated with the estimator (indeed, with any function of the observation X), so the cross term \textsf{Cov}(\widehat\Theta ,\widetilde\Theta ) vanishes; for a general estimator this cross term need not be zero, and the decomposition can fail. A derivation and a counterexample are sketched below.
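
In more detail (the counterexample model is again chosen here for illustration, not taken from the problem): since \Theta =\widehat\Theta -\widetilde\Theta, every estimator satisfies \textsf{Var}(\Theta )=\textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta )-2\,\textsf{Cov}(\widehat\Theta ,\widetilde\Theta ), so the decomposition is equivalent to \textsf{Cov}(\widehat\Theta ,\widetilde\Theta )=0. For the LMS estimator, {\bf E}[\widetilde\Theta \,\widehat\Theta ]={\bf E}\big [\widehat\Theta \,{\bf E}[\widetilde\Theta \mid X]\big ]=0 and {\bf E}[\widetilde\Theta ]=0, so the covariance is indeed zero and the decomposition holds. In contrast, take X=\Theta +W with W zero-mean and independent of \Theta, and use the estimator \widehat\Theta =X. Then \widetilde\Theta =W and \textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta )=\textsf{Var}(\Theta )+2\,\textsf{Var}(W), which exceeds \textsf{Var}(\Theta ) whenever \textsf{Var}(W)>0.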

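Finally, a short Monte Carlo check of both answers, using the same illustrative normal model as above with \sigma _0^2=\sigma ^2=1 (so the LMS estimator is X/2); this simulation and its variable names are a sketch of my own, not part of the problem:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative model: Theta ~ N(0, 1), W ~ N(0, 1) independent, X = Theta + W.
# The LMS estimator E[Theta | X] is then X / 2.
theta = rng.standard_normal(n)
w = rng.standard_normal(n)
x = theta + w

# Part (a): fix Theta = 2 and average the LMS estimate over the noise W.
x_given_theta0 = 2.0 + rng.standard_normal(n)
print("E[Theta_hat | Theta=2] ~", (x_given_theta0 / 2).mean())

# Part (b): compare Var(Theta) with Var(Theta_hat) + Var(error)
# for the LMS estimator X/2 and for the estimator Theta_hat = X.
for name, est in [("LMS estimator X/2", x / 2), ("estimator X", x)]:
    err = est - theta
    print(name, ": Var(Theta) =", round(np.var(theta), 3),
          " Var(est) + Var(err) =", round(np.var(est) + np.var(err), 3))

The first printout should come out near 1 rather than 2, confirming that the conditional error mean is not zero, and the variance decomposition should match (approximately 1 = 1) for the LMS estimator X/2 but not for the estimator X (approximately 3 vs. 1).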