Let A and B be independent random variables, each with mean 1, and with variances 1 and 2, respectively.

Let X = A - B and Y = A + B.

Find the coefficients c1 and c2 of the Linear Least Mean Squares (LLMS) estimator Y^LLMS = c1X + c2 of Y based on X.

E(X) = E(A-B) = 0

E(Y) = E(A+B) = 2
Var(X) = Var(A-B) = Var(A) + Var(B) = 3
Var(Y) = Var(A+B) = Var(A) + Var(B) = 3
Cov(X,Y) = E(XY) - E(X)E(Y)
= E[(A - B)(A + B)] - E(A - B)E(A + B)
= E(A^2 - B^2) - 0
= Var(A) - Var(B) (the mean-squared terms cancel since E(A) = E(B) = 1)
= -1
The LLMS formula gives Y^LLMS = E(Y) + (Cov(X,Y)/Var(X))(X - E(X)) = 2 + (-1/3)(X - 0), so

Y^LLMS = 2 - X/3
c1 = -1/3
c2 = 2

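If anyone wants a numerical sanity check, here is a minimal Monte Carlo sketch (assuming NumPy; the normal distribution is an arbitrary stand-in, since the LLMS coefficients depend only on the means, variances, and covariance):

```python
import numpy as np

# Estimate c1 = Cov(X,Y)/Var(X) and c2 = E[Y] - c1*E[X] by simulation.
rng = np.random.default_rng(0)
n = 1_000_000
A = rng.normal(loc=1.0, scale=1.0, size=n)           # mean 1, variance 1
B = rng.normal(loc=1.0, scale=np.sqrt(2.0), size=n)  # mean 1, variance 2
X, Y = A - B, A + B

cov_xy = ((X - X.mean()) * (Y - Y.mean())).mean()    # should be near -1
c1 = cov_xy / X.var()                                # should be near -1/3
c2 = Y.mean() - c1 * X.mean()                        # should be near 2
print(c1, c2)
```

With a million samples the printed values should agree with -1/3 and 2 to about two decimal places.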

I'm getting 0 for Cov(X,Y), anyone else?

Cov(X,Y) = Cov(A-B, A+B) = Cov(A,A) + Cov(A,B) + Cov(-B,A) + Cov(-B,B) = 1 + 0 + 0 - 1 = 0?

Ignore the one above: Cov(-B, B) = -Var(B) = -2, so the sum is 1 + 0 + 0 - 2 = -1.
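For a symbolic confirmation, here is a minimal sympy.stats sketch (the Normal distribution is just a placeholder with the required moments):

```python
from sympy import sqrt
from sympy.stats import Normal, E

# Cov(X,Y) = E[XY] - E[X]E[Y]; the answer depends only on the
# means and variances, so Normal is an arbitrary choice here.
A = Normal('A', 1, 1)        # mean 1, variance 1
B = Normal('B', 1, sqrt(2))  # mean 1, variance 2
X, Y = A - B, A + B

print(E(X * Y) - E(X) * E(Y))  # -1, i.e. Var(A) - Var(B)
```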

To find the coefficients c1 and c2 of the Linear Least Mean Squares (LLMS) estimator Y^LLMS = c1X + c2 of Y based on X, we need to minimize the mean squared error between the estimated value Y^LLMS and the actual value Y.

The mean squared error (MSE) is given by the formula: MSE = E[(Y - Y^LLMS)^2]

We want to find c1 and c2 that minimize this MSE. Let's calculate the MSE.

Since X = A - B and Y = A + B, we can rewrite Y^LLMS as Y^LLMS = c1(A - B) + c2.

Substituting Y^LLMS in the MSE formula, MSE = E[(Y - c1(A - B) - c2)^2].

Expanding the square and distributing, MSE = E[Y^2 - 2c1Y(A - B) + c1^2(A - B)^2 - 2c2Y + 2c1c2(A - B) + c2^2].

We can rewrite this MSE expression as MSE = E[Y^2] - 2c1E[Y(A - B)] + c1^2E[(A - B)^2] - 2c2E[Y] + 2c1c2E[A - B] + c2^2.
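As a quick algebra check of this expansion, here is a hedged sympy sketch that treats everything as plain symbols (not random variables), writing X for (A - B):

```python
from sympy import symbols, expand

# Expand (Y - c1*X - c2)^2 symbolically; the six terms match the
# six expectations listed below (print order may differ).
X, Y, c1, c2 = symbols('X Y c1 c2')
print(expand((Y - c1*X - c2)**2))
# X**2*c1**2 + 2*X*c1*c2 - 2*X*Y*c1 + Y**2 - 2*Y*c2 + c2**2
```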

Now, let's calculate each expectation term step by step (a symbolic cross-check of all five values follows the list).

1. E[Y^2]: Since A and B are independent, E[AB] = E[A]E[B], so every term can be evaluated from the given moments.
E[Y^2] = E[(A + B)^2] = E[A^2 + 2AB + B^2]
= E[A^2] + 2E[A]E[B] + E[B^2]
= (Var[A] + E[A]^2) + 2E[A]E[B] + (Var[B] + E[B]^2)
= (1 + 1^2) + 2(1)(1) + (2 + 1^2)
= 7.

2. E[Y(A - B)]: This follows from linearity of expectation (note that E[B^2] = Var[B] + E[B]^2, not Var[B]).
E[Y(A - B)] = E[(A + B)(A - B)]
= E[A^2 - B^2]
= E[A^2] - E[B^2]
= (Var[A] + E[A]^2) - (Var[B] + E[B]^2)
= (1 + 1^2) - (2 + 1^2)
= -1.

3. E[(A - B)^2]: Since E[A - B] = 0 (step 5 below), this is simply the variance of (A - B).
E[(A - B)^2] = Var[A - B] + (E[A - B])^2
= Var[A] + Var[B] + 0
= 1 + 2
= 3.

4. E[Y]: This is the expectation of Y.
E[Y] = E[A + B]
= E[A] + E[B]
= 1 + 1
= 2.

5. E[A - B]: This is the expectation of (A - B).
E[A - B] = E[A] - E[B]
= 1 - 1
= 0.
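As a cross-check of all five values, here is a hedged sympy.stats sketch (the Normal distribution is only a placeholder with the required mean and variances; these expectations depend on nothing else):

```python
from sympy import sqrt
from sympy.stats import Normal, E

# Placeholder distributions: any independent A, B with mean 1 and
# variances 1 and 2 give the same five expectations.
A = Normal('A', 1, 1)        # mean 1, variance 1
B = Normal('B', 1, sqrt(2))  # mean 1, variance 2
Y = A + B

print(E(Y**2))         # 7   (step 1)
print(E(Y * (A - B)))  # -1  (step 2)
print(E((A - B)**2))   # 3   (step 3)
print(E(Y))            # 2   (step 4)
print(E(A - B))        # 0   (step 5)
```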

Now, substituting these expectation values back into the MSE expression, we have
MSE = 7 - 2c1(-1) + c1^2(3) - 2c2(2) + 2c1c2(0) + c2^2
= 7 + 2c1 + 3c1^2 - 4c2 + c2^2.

To minimize this MSE, we take the partial derivatives with respect to c1 and c2, and set them equal to zero.

d(MSE)/d(c1) = 2 + 6c1 = 0 --> c1 = -1/3,
d(MSE)/d(c2) = -4 + 2c2 = 0 --> c2 = 2.

The two equations decouple, so each coefficient can be read off directly.

Therefore, the coefficients of the Linear Least Mean Squares (LLMS) estimator Y^LLMS = c1X + c2 of Y based on X are c1 = -1/3 and c2 = 2, in agreement with the direct formulas c1 = Cov(X,Y)/Var(X) and c2 = E[Y] - c1E[X] used above.
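For completeness, a minimal sympy sketch that minimizes the corrected MSE and recovers the same stationary point:

```python
from sympy import symbols, diff, solve

# Corrected MSE from above: 7 + 2*c1 + 3*c1**2 - 4*c2 + c2**2.
c1, c2 = symbols('c1 c2')
mse = 7 + 2*c1 + 3*c1**2 - 4*c2 + c2**2
sol = solve([diff(mse, c1), diff(mse, c2)], [c1, c2])
print(sol)  # {c1: -1/3, c2: 2}
```

The MSE is a convex quadratic in (c1, c2), so this stationary point is the global minimum.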