Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is 1 if a < 0.
CORRECTION:
Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is -1 if a < 0.
cor(X,Y) = cov(X,Y) / sqrt(var(X) * var(Y))
Anyone?
To show that the correlation of X and Y is -1 when a < 0, we need to calculate the correlation coefficient between the two variables. The correlation coefficient, denoted by ρ (rho), measures the strength and direction of the linear relationship between two variables.
Let's start by calculating the covariance between X and Y. The covariance, denoted by Cov(X, Y), measures how much two variables vary together. In this case, we have:
Cov(X, Y) = Cov(X, aX + b)
The covariance can be calculated using the following formula:
Cov(X, Y) = E[(X - μX)(Y - μY)]
where E is the expectation or average value operator, and μX and μY represent the means of X and Y, respectively.
Since Y = aX + b, the mean of Y is μY = E[aX + b] = aμX + b. Substituting Y and μY into the covariance formula:

Cov(X, Y) = E[(X - μX)(aX + b - aμX - b)] = E[(X - μX) · a(X - μX)]

Pulling the constant a out of the expectation:

Cov(X, Y) = a E[(X - μX)^2] = a Var(X)
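As a quick numerical sanity check of the identity Cov(X, aX + b) = a·Var(X), here is a small pure-Python simulation. The constants a = -2 and b = 3 are hypothetical example values; any a < 0 behaves the same way.

```python
import random

random.seed(0)

# Hypothetical constants for illustration: a = -2, b = 3 (any a < 0 works).
a, b = -2.0, 3.0

# Simulate X and form the linear function Y = a*X + b.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
ys = [a * x + b for x in xs]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Sample covariance and variance (population form, dividing by n).
cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n

print(cov_xy)        # matches a * var_x up to floating-point rounding
print(a * var_x)
```

Because y - mean_y = a(x - mean_x) term by term, the two printed values agree exactly up to floating-point rounding, not just approximately in large samples.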
Now, we need the variances of X and Y. The variances, Var(X) and Var(Y), are the squares of the standard deviations of the respective variables. The variance can be calculated using the following formula:
Var(Z) = E[(Z - μZ)^2]
Applying this formula to X and Y, we have:
Var(X) = E[(X - μX)^2]
Var(Y) = E[(Y - μY)^2]
Substituting Y = aX + b and μY = aμX + b into the expression for Var(Y):

Var(Y) = E[(aX + b - aμX - b)^2] = E[a^2 (X - μX)^2]

Pulling out the constant a^2:

Var(Y) = a^2 E[(X - μX)^2] = a^2 Var(X)

so the standard deviation of Y is √Var(Y) = |a| √Var(X).
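The scaling rule Var(aX + b) = a²·Var(X) can also be checked numerically with a short pure-Python sketch (the constants a = -2, b = 3 are hypothetical; the sign of a does not matter for the variance):

```python
import random

random.seed(1)

# Hypothetical constants: a = -2, b = 3; the sign of a is irrelevant here.
a, b = -2.0, 3.0

xs = [random.gauss(5.0, 2.0) for _ in range(50_000)]
ys = [a * x + b for x in xs]

def variance(zs):
    """Population variance: E[(Z - mean)^2] estimated from a sample."""
    m = sum(zs) / len(zs)
    return sum((z - m) ** 2 for z in zs) / len(zs)

print(variance(ys))            # matches a**2 * variance(xs)
print(a ** 2 * variance(xs))
```

The shift b drops out entirely, and the scale a enters squared, which is why the standard deviation picks up a factor of |a| rather than a.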
Now, let's calculate the correlation coefficient (ρ) using the formula:
ρ = Cov(X, Y) / [√(Var(X)) √(Var(Y))]
Substituting Cov(X, Y) = a Var(X), √Var(X), and √Var(Y) = |a| √Var(X):

ρ = a Var(X) / [√Var(X) · |a| √Var(X)] = a Var(X) / (|a| Var(X)) = a / |a|

(This step requires Var(X) > 0, i.e., X is not constant, so the ratio is well defined.)

Now, consider the case where a < 0. For a negative number, |a| = -a, so:

ρ = a / (-a) = -1

Hence, we've shown that when a < 0, the correlation coefficient between X and Y is equal to -1. (By the same argument, a > 0 gives ρ = a / a = 1.)
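To see the result numerically, here is a pure-Python simulation of the correlation coefficient for a linear function with negative slope. The constants a = -0.5 and b = 10 are hypothetical example values; any a < 0 gives the same answer.

```python
import math
import random

random.seed(2)

# Hypothetical constants with a < 0: a = -0.5, b = 10.
a, b = -0.5, 10.0

xs = [random.uniform(0.0, 1.0) for _ in range(10_000)]
ys = [a * x + b for x in xs]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Covariance and standard deviations (population form, dividing by n).
cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs) / n)
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / n)

rho = cov_xy / (sd_x * sd_y)
print(rho)  # ≈ -1.0
```

Because Y is an exact linear function of X, the sample correlation is -1 up to floating-point rounding for every sample, not merely in the limit of large n.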