For the model X=\Theta +W, and under the usual independence and normality assumptions for \Theta and W, the mean squared error of the LMS estimator is

\frac{1}{(1/\sigma _0^2)+(1/\sigma _1^2)},

where \sigma _0^2 and \sigma _1^2 are the variances of \Theta and W, respectively.
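This formula is easy to check numerically. The sketch below (not part of the original text) runs a Monte Carlo simulation under the standard assumptions behind this formula: Θ and W are independent zero-mean normals. The variance values σ₀² = 2, σ₁² = 3 are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
s0sq, s1sq = 2.0, 3.0   # illustrative variances of Theta and W
n = 1_000_000

theta = rng.normal(0.0, np.sqrt(s0sq), n)  # Theta ~ N(0, s0sq)
w = rng.normal(0.0, np.sqrt(s1sq), n)      # W ~ N(0, s1sq), independent of Theta
x = theta + w

# LMS (conditional-expectation) estimator for this zero-mean normal model:
# E[Theta | X] = (s0sq / (s0sq + s1sq)) * X
theta_hat = (s0sq / (s0sq + s1sq)) * x

empirical_mse = np.mean((theta - theta_hat) ** 2)
formula_mse = 1.0 / (1.0 / s0sq + 1.0 / s1sq)
print(empirical_mse, formula_mse)  # both approximately 1.2
```

With a million samples the empirical MSE agrees with the formula value 1/(1/2 + 1/3) = 1.2 to within a few parts per thousand.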

Suppose now that we change the observation model to Y=3\Theta +W. In some sense the “signal” \Theta has a stronger presence, relative to the noise term W, and we should expect to obtain a smaller mean squared error. Suppose \sigma _0^2=\sigma _1^2=1. The mean squared error of the original model X=\Theta +W is then 1/2. In contrast, the mean squared error of the new model Y=3\Theta +W is

\frac{1}{(1/\sigma _0^2)+(3^2/\sigma _1^2)}=\frac{1}{1+9}=\frac{1}{10}.

To see where the factor 3^2 comes from, divide the new observation by 3:

Y/3 = Θ + W/3.

This has exactly the form of the original model, except that the noise is now W/3, with variance σ_1^2 / 9. Substituting this noise variance into the original formula gives

MSE = 1 / (1 / σ_0^2 + 9 / σ_1^2).

Given that σ_0^2 = σ_1^2 = 1, we can substitute these values into the formula:

MSE = 1 / (1 + 9) = 1 / 10.

Therefore, the mean squared error of the new model Y = 3Θ + W is 1/10, five times smaller than the 1/2 obtained under the original model, confirming the intuition that a stronger signal yields a smaller error.
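The value 1/10 can also be confirmed by simulation. The sketch below (an addition, not from the original text) assumes zero-mean standard normal Θ and W and uses the LMS estimator for the new model, E[Θ | Y] = (3σ_0^2 / (9σ_0^2 + σ_1^2)) Y = 0.3 Y.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

theta = rng.normal(0.0, 1.0, n)  # Theta ~ N(0, 1), so sigma_0^2 = 1
w = rng.normal(0.0, 1.0, n)      # W ~ N(0, 1), independent, so sigma_1^2 = 1
y = 3 * theta + w

# LMS estimator for the zero-mean normal model:
# E[Theta | Y] = cov(Theta, Y) / var(Y) * Y = 3 / (9 + 1) * Y
theta_hat = 0.3 * y

empirical_mse = np.mean((theta - theta_hat) ** 2)
print(empirical_mse)  # approximately 0.1
```

The empirical MSE lands very close to 1/10, matching the formula and clearly beating the 1/2 achieved under the original model X = Θ + W.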