For all problems on this page, suppose you have data

X_1, \ldots, X_n \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1)

that is, a random sample of independent and identically distributed standard normal random variables.

Useful facts:

For a standard normal random variable X_1, we have:

\mathbb{E}[X_1] = 0, \quad \mathbb{E}[X_1^2] = 1, \quad \mathbb{E}[X_1^4] = 3.

Consider the sample mean:

\overline{X}_n = \frac{1}{n}\left(X_1 + X_2 + \cdots + X_n\right).

What are the mean \mathbb{E}[\overline{X}_n] and variance \textsf{Var}[\overline{X}_n] of \overline{X}_n?

The sample mean \(\overline{X}_n\) is defined as the average of the random variables \(X_1, X_2, \ldots, X_n\). Since the \(X_i\) are i.i.d. standard normal random variables, we can compute the mean and variance of \(\overline{X}_n\) directly from the basic properties of expectation and variance.

Mean of \(\overline{X}_n\):
The mean of \(\overline{X}_n\) equals the mean of the individual random variables \(X_i\), which is 0. This follows from the linearity of expectation:

\(\mathbb{E}[\overline{X}_n] = \frac{1}{n}\left(\mathbb{E}[X_1] + \mathbb{E}[X_2] + \cdots + \mathbb{E}[X_n]\right) = \frac{1}{n} \cdot n \cdot \mathbb{E}[X_1] = \mathbb{E}[X_1] = 0\).

Variance of \(\overline{X}_n\):
The variance of \(\overline{X}_n\) can be calculated using two facts: scaling a random variable by \(\frac{1}{n}\) scales its variance by \(\frac{1}{n^2}\), and the variance of a sum of independent random variables is the sum of their variances. Since the \(X_i\) are independent, each with variance 1, we have:

\(\text{Var}[\overline{X}_n] = \text{Var}\left(\frac{1}{n}\left(X_1 + X_2 + \cdots + X_n\right)\right) = \frac{1}{n^2}\left(\text{Var}[X_1] + \text{Var}[X_2] + \cdots + \text{Var}[X_n]\right) = \frac{1}{n^2} \cdot n \cdot \text{Var}[X_1] = \frac{1}{n}\).

Therefore, the mean of \(\overline{X}_n\) is 0 and the variance of \(\overline{X}_n\) is \(\frac{1}{n}\).
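
As a quick sanity check, here is a small Monte Carlo simulation of these two results (a sketch using NumPy; the sample size \(n = 50\) and the number of trials are arbitrary illustration choices, not part of the problem):

```python
import numpy as np

# Monte Carlo check of E[X_bar_n] = 0 and Var[X_bar_n] = 1/n.
# n = 50 and n_trials = 100_000 are arbitrary illustration values.
rng = np.random.default_rng(0)
n, n_trials = 50, 100_000

# Each row is one i.i.d. N(0, 1) sample of size n; take the row means.
means = rng.standard_normal((n_trials, n)).mean(axis=1)

print(f"empirical mean:     {means.mean():+.4f}   (theory: 0)")
print(f"empirical variance: {means.var():.4f}   (theory: 1/n = {1/n:.4f})")
```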

What kind of distribution does \overline{X}_n follow?

The sample mean \(\overline{X}_n\) follows a normal (Gaussian) distribution. In general the Central Limit Theorem only guarantees that a sample mean is approximately normal for large \(n\), but no approximation is needed here: a linear combination of independent normal random variables is itself exactly normal, so \(\overline{X}_n\) is normally distributed for every \(n\).

Specifically, using the mean and variance computed above, \(\overline{X}_n\) is exactly normal with mean 0 and variance \(\frac{1}{n}\).

Mathematically, we can express this as:

\(\overline{X}_n \sim \mathcal{N}\left(0, \frac{1}{n}\right)\)

This means that as the sample size increases, the distribution of \(\overline{X}_n\) becomes increasingly concentrated around its mean of 0, since its variance \(\frac{1}{n}\) shrinks to zero.

The options are:

Gaussian

Student t

Chi square

Gamma

nonparametric

The correct answer is the Gaussian (also known as normal) distribution.
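
To see the exact normality concretely, the following sketch (assuming NumPy and SciPy are available; \(n = 25\) and the trial count are arbitrary choices) draws many replications of \(\overline{X}_n\), rescales them by \(\sqrt{n}\), and runs a Kolmogorov-Smirnov test against the standard normal:

```python
import numpy as np
from scipy import stats

# If X_bar_n ~ N(0, 1/n) exactly, then sqrt(n) * X_bar_n ~ N(0, 1).
# n = 25 and n_trials = 50_000 are arbitrary illustration values.
rng = np.random.default_rng(1)
n, n_trials = 25, 50_000

means = rng.standard_normal((n_trials, n)).mean(axis=1)
standardized = np.sqrt(n) * means  # should be standard normal draws

# Kolmogorov-Smirnov test against the N(0, 1) CDF; a large p-value
# is consistent with exact (not merely asymptotic) normality.
stat, p_value = stats.kstest(standardized, "norm")
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.3f}")
```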

Consider the quantile Q_{n,\alpha} of order 1-\alpha for the random variable \overline{X}_n, that is, the number Q_{n,\alpha} such that

\displaystyle \mathbf{P}\Big(\overline{X}_n \le Q_{n,\alpha}\Big) = 1-\alpha, \qquad 0 < \alpha < 1.

(Be careful that the probability in the definition above is 1-\alpha, not \alpha.)

For \alpha < 0.5, as the sample size n increases, does the quantile Q_{n,\alpha} decrease, increase, stay the same, or oscillate?

For \(\alpha < 0.5\), as the sample size \(n\) increases, the quantile \(Q_{n,\alpha}\) decreases.

The quantile \(Q_{n,\alpha}\) is the value below which \(\overline{X}_n\) falls with probability \(1-\alpha\). Since \(\overline{X}_n \sim \mathcal{N}\left(0, \frac{1}{n}\right)\), we can write it explicitly as

\(Q_{n,\alpha} = \frac{q_{1-\alpha}}{\sqrt{n}}\),

where \(q_{1-\alpha} = \Phi^{-1}(1-\alpha)\) is the \((1-\alpha)\)-quantile of the standard normal distribution. For \(\alpha < 0.5\) we have \(1-\alpha > 0.5\), so \(q_{1-\alpha} > 0\); dividing a fixed positive number by \(\sqrt{n}\) gives a quantity that decreases toward 0 as \(n\) grows.

In other words, as the sample size increases, the distribution of \(\overline{X}_n\) concentrates around 0, which pulls its upper quantiles down toward 0.
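
The closed form above is easy to evaluate numerically. The following sketch (assumes SciPy; \(\alpha = 0.05\) is just one example of the \(\alpha < 0.5\) case) computes \(Q_{n,\alpha}\) for increasing \(n\):

```python
import numpy as np
from scipy import stats

# X_bar_n ~ N(0, 1/n), so Q_{n,alpha} = Phi^{-1}(1 - alpha) / sqrt(n).
# alpha = 0.05 is one arbitrary example of the alpha < 0.5 case.
alpha = 0.05
z = stats.norm.ppf(1 - alpha)  # positive, since 1 - alpha > 0.5

for n in (10, 100, 1_000, 10_000):
    q = z / np.sqrt(n)
    print(f"n = {n:>6}:  Q_(n, 0.05) = {q:.4f}")
```

For \(\alpha = 0.05\) this prints quantiles shrinking from about 0.52 at \(n = 10\) toward about 0.016 at \(n = 10{,}000\), confirming that \(Q_{n,\alpha}\) decreases.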