Convergence in distribution

Let Tn be a sequence of random variables that converges to N(0,1) in distribution.

Let Y = 2*Tn + 1

Compute the limits, as n tends to infinity, of

E[Y] =

Var[Y] =

Let Phi be the cumulative distribution function (cdf) of the standard Gaussian distribution. In terms of Phi, what is the limit, as n tends to infinity, of P(|Tn + 2| <= 8)?

P(|Tn + 2| <= 8) =

To compute E[Y], we can use the linearity of expectation:

E[Y] = E[2*Tn + 1]
= E[2*Tn] + E[1]
= 2*E[Tn] + 1.

Convergence in distribution does not by itself guarantee that moments converge, but under the standard additional assumption that the moments of Tn converge to those of the limiting N(0,1) distribution, E[Tn] tends to 0. Therefore, the limit of E[Y] as n tends to infinity is 2*0 + 1 = 1.
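
As a quick numerical sanity check, here is a minimal Python sketch. The problem does not specify Tn, so the sketch uses one hypothetical concrete choice: the standardized mean of n Exp(1) draws, which converges to N(0,1) by the central limit theorem. For large n, the sample mean of Y should be close to 1.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical concrete choice of Tn (not given in the problem):
    # the standardized mean of n Exp(1) draws, which converges to N(0,1) by the CLT.
    n, reps = 1_000, 10_000
    samples = rng.exponential(1.0, size=(reps, n))
    Tn = (samples.mean(axis=1) - 1.0) * np.sqrt(n)

    Y = 2 * Tn + 1
    print(Y.mean())  # expect a value close to 2*0 + 1 = 1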

To compute Var[Y], we can use the property that Var[aX + b] = a^2*Var[X] for any constants a and b:

Var[Y] = Var[2*Tn + 1]
= Var[2*Tn]
= 4*Var[Tn].

Under the same assumption that the moments of Tn converge to those of N(0,1), Var[Tn] tends to 1. Therefore, the limit of Var[Y] as n tends to infinity is 4*1 = 4.
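
The same simulation sketch (same hypothetical Tn as above) can be reused to check the variance; the sample variance of Y should be close to 4.

    import numpy as np

    rng = np.random.default_rng(0)

    # Same hypothetical Tn as in the previous sketch.
    n, reps = 1_000, 10_000
    samples = rng.exponential(1.0, size=(reps, n))
    Tn = (samples.mean(axis=1) - 1.0) * np.sqrt(n)

    Y = 2 * Tn + 1
    print(Y.var())  # expect a value close to 4*1 = 4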

To compute the limit of P(|Tn + 2| ≤ 8), we can use the definition of convergence in distribution: Tn converging to N(0,1) in distribution means that, for every point x at which Φ is continuous (and Φ is continuous everywhere), we have:

lim n->∞ P(Tn ≤ x) = Φ(x).

In this case, we want the limit as n tends to infinity of P(|Tn + 2| ≤ 8). Since the boundary points of this event are continuity points of Φ, that limit equals P(|N(0,1) + 2| ≤ 8), where N(0,1) stands for a standard Gaussian random variable.

Unpacking the absolute value and then subtracting 2 from all three parts of the inequality, we can rewrite this probability as:

P(-8 ≤ N(0,1) + 2 ≤ 8)
= P(-10 ≤ N(0,1) ≤ 6).

Using the cumulative distribution function (cdf) of the standard Gaussian distribution, we can write this probability as:

P(-10 ≤ N(0,1) ≤ 6) = Φ(6) - Φ(-10).

Therefore, the limit of P(|Tn + 2| ≤ 8) as n tends to infinity is Φ(6) - Φ(-10).
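
As a final sanity check (a sketch assuming SciPy is available, and reusing the same hypothetical Tn as in the earlier sketches), Φ(6) - Φ(-10) can be evaluated numerically and compared with a Monte Carlo estimate of P(|Tn + 2| ≤ 8); both are extremely close to 1.

    import numpy as np
    from scipy.stats import norm

    # The limit in terms of Phi: Phi(6) - Phi(-10), numerically about 1 - 1e-9.
    print(norm.cdf(6) - norm.cdf(-10))

    # Monte Carlo estimate with the same hypothetical Tn (standardized mean of
    # n Exp(1) draws); the estimated probability should be essentially 1.
    rng = np.random.default_rng(0)
    n, reps = 1_000, 10_000
    samples = rng.exponential(1.0, size=(reps, n))
    Tn = (samples.mean(axis=1) - 1.0) * np.sqrt(n)
    print(np.mean(np.abs(Tn + 2) <= 8))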