Suppose that two random variables X1 and X2 have a bivariate normal distribution, and Var(X1) = Var(X2). Show that the sum X1 + X2 and the difference X1 − X2 are independent.

Solution:

Let μ1 and μ2 be the means of X1 and X2 respectively, σ1 and σ2 their standard deviations, and ρ their correlation. Since Var(X1) = Var(X2), we have σ1 = σ2; write σ for this common standard deviation.

The joint probability density function of X1 and X2 is given by (with σ² the common variance and ρ the correlation between X1 and X2):

f(x1, x2) = (1 / (2πσ²√(1 − ρ²))) exp{−[(x1 − μ1)² − 2ρ(x1 − μ1)(x2 − μ2) + (x2 − μ2)²] / (2σ²(1 − ρ²))}
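As a quick sanity check, this equal-variance density (with a correlation parameter ρ, which may be nonzero for a general bivariate normal) can be compared numerically against scipy. This is a minimal sketch; the parameters below are illustrative, not taken from the problem.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative parameters (assumed for the check, not given in the problem).
mu1, mu2, sigma, rho = 1.0, -0.5, 2.0, 0.6

def joint_pdf(x1, x2):
    """Bivariate normal density with equal variances sigma**2 and correlation rho."""
    q = (x1 - mu1)**2 - 2*rho*(x1 - mu1)*(x2 - mu2) + (x2 - mu2)**2
    norm_const = 1.0 / (2*np.pi*sigma**2*np.sqrt(1 - rho**2))
    return norm_const * np.exp(-q / (2*sigma**2*(1 - rho**2)))

# Reference density from scipy with the matching covariance matrix.
cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
rv = multivariate_normal(mean=[mu1, mu2], cov=cov)

print(joint_pdf(0.3, 1.2), rv.pdf((0.3, 1.2)))  # the two values agree
```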

Because X1 + X2 and X1 − X2 are linear combinations of the jointly normal pair (X1, X2), the vector (X1 + X2, X1 − X2) is itself bivariate normal. For jointly normal random variables, zero covariance implies independence, so it suffices to show that Cov(X1 + X2, X1 − X2) = 0.

We will use two properties of covariance: it is bilinear, so Cov(U + V, W) = Cov(U, W) + Cov(V, W), and Cov(U, U) = Var(U).

Now, let's calculate the covariance between X1 + X2 and X1 - X2:

Cov(X1 + X2, X1 - X2) = Cov(X1, X1) - Cov(X1, X2) + Cov(X2, X1) - Cov(X2, X2)

Simplifying the equation using the properties of covariance:

Cov(X1 + X2, X1 - X2) = Var(X1) - Cov(X1, X2) + Cov(X1, X2) - Var(X2)

Cov(X1 + X2, X1 - X2) = Var(X1) - Var(X2)

Since Var(X1) = Var(X2), we can further simplify the equation:

Cov(X1 + X2, X1 - X2) = 0

Therefore the covariance between X1 + X2 and X1 − X2 is zero. Since the pair (X1 + X2, X1 − X2) is bivariate normal, zero covariance implies that they are independent.
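The zero-covariance calculation can be checked by simulation. This is a minimal sketch; the mean, variance, and correlation below are illustrative, and any choice with equal variances works.

```python
import numpy as np

# Simulate a bivariate normal with equal variances and nonzero correlation.
rng = np.random.default_rng(0)
mu = [1.0, -0.5]
sigma, rho = 2.0, 0.6          # illustrative values
cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
x1, x2 = rng.multivariate_normal(mu, cov, size=500_000).T

s, d = x1 + x2, x1 - x2
# Sample covariance of sum and difference should be near zero.
print(np.cov(s, d)[0, 1])
```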

Alternatively, we can show independence directly by demonstrating that the joint probability distribution of the sum X1 + X2 and the difference X1 − X2 factors into the product of their marginal distributions.

Let's start by expressing X1 and X2 in terms of their sum and difference:

X1 = (X1 + X2)/2 + (X1 − X2)/2
X2 = (X1 + X2)/2 - (X1 − X2)/2

Next, let's define two new random variables Y1 = (X1 + X2)/2 and Y2 = (X1 − X2)/2.

By rearranging the equations above, we can solve for X1 and X2:

X1 = Y1 + Y2
X2 = Y1 - Y2

Now, we need to find the joint probability distribution of Y1 and Y2. Since X1 and X2 have a bivariate normal distribution, it follows that Y1 and Y2 will also have a bivariate normal distribution.

Denote the joint probability density of Y1 and Y2 by f(Y1, Y2).

To prove independence, we need to show that the joint probability distribution f(Y1, Y2) can be expressed as the product of their marginal distributions.

The marginal distribution of Y1 can be obtained by integrating f(Y1, Y2) with respect to Y2, and the marginal distribution of Y2 can be obtained by integrating f(Y1, Y2) with respect to Y1.

Let g1(Y1) represent the marginal distribution of Y1, and g2(Y2) represent the marginal distribution of Y2.

To show independence, we need to demonstrate that f(Y1, Y2) = g1(Y1) * g2(Y2).

To do this, we can use the change-of-variables formula for probability densities.

First, let's calculate the Jacobian determinant of the transformation:

J = ∂(X1, X2) / ∂(Y1, Y2)

Since X1 = Y1 + Y2 and X2 = Y1 − Y2 are linear in Y1 and Y2, the Jacobian determinant is constant:

J = (∂X1/∂Y1)(∂X2/∂Y2) − (∂X1/∂Y2)(∂X2/∂Y1) = (1)(−1) − (1)(1) = −2,

so |J| = 2. (The inverse map (X1, X2) → (Y1, Y2) has Jacobian determinant −1/2.)
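Both Jacobian determinants can be verified symbolically with sympy; note that the forward map (Y1, Y2) → (X1, X2) has determinant −2, while the inverse map has determinant −1/2, its reciprocal.

```python
import sympy as sp

# Forward map (Y1, Y2) -> (X1, X2): X1 = Y1 + Y2, X2 = Y1 - Y2.
y1, y2 = sp.symbols('y1 y2')
J_fwd = sp.Matrix([y1 + y2, y1 - y2]).jacobian([y1, y2])
print(J_fwd.det())  # -2, so |J| = 2 for the forward map

# Inverse map (X1, X2) -> (Y1, Y2): Y1 = (X1 + X2)/2, Y2 = (X1 - X2)/2.
x1, x2 = sp.symbols('x1 x2')
J_inv = sp.Matrix([(x1 + x2) / 2, (x1 - x2) / 2]).jacobian([x1, x2])
print(J_inv.det())  # -1/2
```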

By the change-of-variables formula, the joint density of (Y1, Y2) is

f(y1, y2) = f_X(y1 + y2, y1 − y2) · |J| = 2 f_X(y1 + y2, y1 − y2),

where f_X is the joint density of (X1, X2). Writing σ² for the common variance and ρ for the correlation of X1 and X2, the exponent of f_X is

−[(x1 − μ1)² − 2ρ(x1 − μ1)(x2 − μ2) + (x2 − μ2)²] / (2σ²(1 − ρ²)).

Substitute x1 = y1 + y2 and x2 = y1 − y2, and set

a = y1 − (μ1 + μ2)/2,  b = y2 − (μ1 − μ2)/2,

so that x1 − μ1 = a + b and x2 − μ2 = a − b. Then

(a + b)² − 2ρ(a + b)(a − b) + (a − b)² = 2a² + 2b² − 2ρ(a² − b²) = 2(1 − ρ)a² + 2(1 + ρ)b².

The cross terms in ab cancel precisely because Var(X1) = Var(X2), so the exponent splits into a function of y1 alone plus a function of y2 alone:

f(y1, y2) = (2 / (2πσ²√(1 − ρ²))) exp[−a² / (σ²(1 + ρ))] exp[−b² / (σ²(1 − ρ))].

This is exactly g1(Y1) · g2(Y2), where g1 is the density of a normal with mean (μ1 + μ2)/2 and variance σ²(1 + ρ)/2, and g2 is the density of a normal with mean (μ1 − μ2)/2 and variance σ²(1 − ρ)/2. Hence Y1 and Y2 are independent, and therefore so are X1 + X2 = 2Y1 and X1 − X2 = 2Y2.

Therefore, the sum X1 + X2 and the difference X1 − X2 are independent.
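The factorization of the joint density of (Y1, Y2) into normal marginals can also be checked numerically. A minimal sketch assuming scipy is available, with illustrative parameters; the joint density is built from the change of variables with |J| = 2 and compared against the product of the two marginal normal densities.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Illustrative parameters (any equal-variance bivariate normal works).
mu1, mu2, sigma, rho = 1.0, -0.5, 2.0, 0.6

cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
f_x = multivariate_normal(mean=[mu1, mu2], cov=cov).pdf

# Joint density of (Y1, Y2) via the change of variables, with |J| = 2.
def f_y(y1, y2):
    return 2.0 * f_x([y1 + y2, y1 - y2])

# Marginals implied by independence:
# Y1 ~ N((mu1+mu2)/2, sigma^2(1+rho)/2), Y2 ~ N((mu1-mu2)/2, sigma^2(1-rho)/2).
g1 = norm(loc=(mu1 + mu2) / 2, scale=sigma * np.sqrt((1 + rho) / 2)).pdf
g2 = norm(loc=(mu1 - mu2) / 2, scale=sigma * np.sqrt((1 - rho) / 2)).pdf

for y1, y2 in [(0.0, 0.0), (1.3, -0.7), (-2.1, 0.4)]:
    print(f_y(y1, y2), g1(y1) * g2(y2))  # each pair of values matches
```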