A fair coin is flipped independently until the first Heads is observed. Let the random variable K be the number of tosses until the first Heads is observed, plus 1. For example, if we see TTTHTH, then K=5. For k=1,2,…,K, let Xk be a continuous random variable that is uniform over the interval [0,5]. The Xk are independent of one another and of the coin flips. Let X = ∑_{k=1}^K Xk. Find the mean and variance of X. You may use the fact that the mean and variance of a geometric random variable with parameter p are 1/p and (1−p)/p^2, respectively.

What are:

E[X] = ?

Var[X] = ?

To find the mean and variance of X, we use the properties of the geometric and uniform distributions, together with the laws of total expectation and total variance, since X is a sum with a random number of terms.

First, let's find the mean of X, denoted as E[X].

E[X] is the expected value of the sum of the Xk random variables. Since each Xk follows a continuous uniform distribution over the interval [0,5], the mean of each Xk is (0+5)/2 = 2.5.

The number of tosses until the first Heads is observed, call it G, follows a geometric distribution with parameter p = 1/2, since the coin is fair. By definition, K = G + 1.

The mean of G is 1/p = 2 and its variance is (1-p)/p^2 = 2, so E[K] = 2 + 1 = 3 and Var[K] = Var[G] = 2 (adding the constant 1 does not change the variance).
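
As a quick sanity check on E[K] and Var[K] (not part of the derivation; the helper name sample_K and the sample size are my own arbitrary choices), here is a minimal simulation sketch:

```python
import random

def sample_K():
    tosses = 0
    while True:
        tosses += 1
        if random.random() < 0.5:   # Heads with probability 1/2
            return tosses + 1       # K = tosses until first Heads, plus 1

samples = [sample_K() for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((k - mean) ** 2 for k in samples) / len(samples)
print(mean, var)   # should be close to E[K] = 3 and Var[K] = 2
```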

Now, because the number of terms K is itself random, we condition on K and use the law of total expectation:

E[X] = ∑_{k=1}^∞ E[X | K=k] * P(K=k)

Given K = k, X is the sum of k independent uniform random variables, so E[X | K=k] = k * E[X1] = 2.5k. Therefore:

E[X] = ∑_{k=1}^∞ 2.5k * P(K=k) = 2.5 * E[K] = 2.5 * 3 = 7.5

So, E[X] = 7.5.
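
As a rough Monte Carlo check of this value (a sketch only; the helper sample_X, the seed, and the sample size are my own choices, and I'm using numpy's Generator.geometric, which returns the number of trials up to and including the first success):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X():
    # tosses until first Heads (geometric, support 1, 2, ...), plus 1
    K = rng.geometric(0.5) + 1
    # sum of K independent uniforms on [0, 5]
    return rng.uniform(0.0, 5.0, size=K).sum()

xs = np.array([sample_X() for _ in range(200_000)])
print(xs.mean())   # should be close to 7.5
```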

Now, let's find the variance of X, denoted as Var[X].

Because the number of terms K is random, Var[X] is not simply the sum of the variances of the Xk; we need the law of total variance for a random sum.

Since each Xk follows a continuous uniform distribution over the interval [0,5], the variance of each Xk is equal to ((5-0)^2)/12 = 25/12.

For a sum of a random number K of i.i.d. terms, with K independent of the Xk, the law of total variance gives:

Var[X] = E[K] * Var[X1] + Var[K] * (E[X1])^2

Substituting E[K] = 3, Var[K] = 2, Var[X1] = 25/12, and E[X1] = 2.5:

Var[X] = 3 * (25/12) + 2 * (2.5)^2
= 6.25 + 12.5
= 18.75

So, Var[X] = 75/4 = 18.75.
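
For an exact arithmetic check of these numbers, a tiny sketch with Python's fractions module (the variable names are my own):

```python
from fractions import Fraction

E_K, Var_K = Fraction(3), Fraction(2)            # K = G + 1 with G geometric(1/2)
E_X1, Var_X1 = Fraction(5, 2), Fraction(25, 12)  # uniform on [0, 5]

Var_X = E_K * Var_X1 + Var_K * E_X1 ** 2         # law of total variance for a random sum
print(Var_X)   # 75/4, i.e. 18.75
```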

Another way to find the mean and variance of X is to work directly with E[X^2] and the formula Var[X] = E[X^2] - (E[X])^2, conditioning on K throughout.

First, let's analyze the problem:

The number of tosses until the first Heads, call it G, follows a geometric distribution with parameter p = 1/2, since the coin is fair, and K = G + 1. Hence E[K] = 1/p + 1 = 3, Var[K] = (1-p)/p^2 = 2, and E[K^2] = Var[K] + (E[K])^2 = 2 + 9 = 11.

Since each Xk is uniformly distributed over the interval [0,5], we can use the standard moments of the uniform distribution.

Now let's start with finding the mean, E[X]:

To find the expected value of X, we can use linearity of expectation, but we have to account for the fact that the number of terms K is random. Since X is the sum of K of the Xk:

E[X] = E[∑_{k=1}^K Xk]

We can rewrite this as:

E[X] = E[X1 + X2 + X3 + ... + XK]

Because K is random, we cannot simply add up "K copies" of E[Xk]; instead we condition on K. Given K = k, linearity of expectation gives:

E[X | K = k] = E[X1] + E[X2] + ... + E[Xk] = k * E[X1]

since the Xk are identically distributed. E[X1] is the expected value of a uniform random variable over [0,5], which is (0+5)/2 = 2.5. Taking the expectation over K (the law of total expectation):

E[X] = E[K] * E[X1] = 3 * 2.5 = 7.5

Next, let's find the variance, Var[X]:

To find the variance, we can use the following formula:

Var[X] = E[X^2] - (E[X])^2

To find E[X^2], we again condition on K. Given K = k:

E[X^2 | K = k] = E[(X1 + X2 + ... + Xk)^2]

Expanding the square and using linearity of expectation:

E[X^2 | K = k] = E[X1^2] + E[X2^2] + ... + E[Xk^2] + 2∑_{i<j} E[Xi*Xj]

Since the Xk are identically distributed, E[Xi*Xj] is the same for every pair with i ≠ j; call it c. The double sum has k(k-1)/2 pairs, each counted twice, so:

E[X^2 | K = k] = k * E[X1^2] + k(k-1) * c

E[X1^2] is the second moment of a uniform random variable on [0,5]: E[X1^2] = Var[X1] + (E[X1])^2 = 25/12 + 25/4 = 25/3 (equivalently, ∫_0^5 (x^2/5) dx = 25/3).

So, E[X^2 | K = k] = k * (25/3) + k(k-1) * c

Now, let's find c. For i ≠ j, Xi and Xj are independent, so:

c = E[Xi*Xj] = E[Xi] * E[Xj] = 2.5 * 2.5 = 6.25

Substituting this back and taking the expectation over K, using E[K] = 3 and E[K(K-1)] = E[K^2] - E[K] = 11 - 3 = 8:

E[X^2] = E[K] * (25/3) + E[K(K-1)] * 6.25 = 25 + 50 = 75

Finally, let's compute the variance:

Var[X] = E[X^2] - (E[X])^2

Substituting the values we found:

Var[X] = 75 - (7.5)^2 = 75 - 56.25 = 18.75 = 75/4

So, the mean and variance of X are:

E[X] = 7.5
Var[X] = 75/4 = 18.75

which agrees with the law-of-total-variance calculation above.
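
And a similar exact check for this second route, again only verifying the arithmetic (variable names are my own):

```python
from fractions import Fraction

E_K = Fraction(3)
E_K2 = Fraction(11)                       # E[K^2] = Var[K] + (E[K])^2 = 2 + 9
E_X1_sq = Fraction(25, 3)                 # E[X1^2] for uniform on [0, 5]
c = Fraction(5, 2) ** 2                   # E[Xi] * E[Xj] for i != j

E_X2 = E_K * E_X1_sq + (E_K2 - E_K) * c   # = 25 + 50 = 75
Var_X = E_X2 - Fraction(15, 2) ** 2       # subtract (E[X])^2 = 56.25
print(E_X2, Var_X)                        # 75 and 75/4
```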

My probability theory is pretty rusty, but this seems like it might be a good place to start.

https://stats.stackexchange.com/questions/136806/expected-number-of-tosses-till-first-head-comes-up