Y x1 x2

57 220 125
43 215 147
73 250 118
37 241 160
64 305 128
48 258 149
56 354 145
50 321 150
39 370 140
43 375 115
69 385 155
60 385 152
Calculate
1. the fitted regression equation
2. the coefficient of multiple determination
3. the test for each individual coefficient
4. the test for the overall significance of the model

To calculate the fitted regression equation, we use the model:

Y' = b0 + b1*x1 + b2*x2

where Y' is the predicted value of Y, b0 is the intercept, and b1 and b2 are the coefficients for x1 and x2, respectively.

To find the coefficients (b0, b1, b2), we use the least squares method. With two independent variables the slopes cannot be computed one at a time with the simple-regression formula; instead, b1 and b2 are obtained by solving the normal equations:

b1*S11 + b2*S12 = S1y
b1*S12 + b2*S22 = S2y
b0 = ȳ - b1*x̄1 - b2*x̄2

where

S11 = Σ(xi1 - x̄1)^2, S22 = Σ(xi2 - x̄2)^2, S12 = Σ((xi1 - x̄1)(xi2 - x̄2)),
S1y = Σ((xi1 - x̄1)(yi - ȳ)), S2y = Σ((xi2 - x̄2)(yi - ȳ)),

xi1 and xi2 are the values of x1 and x2 for each observation i, x̄1 and x̄2 are the means of x1 and x2 respectively, yi is the value of Y for each observation i, and ȳ is the mean of Y.

Calculating the means:

x̄1 = (220 + 215 + 250 + 241 + 305 + 258 + 354 + 321 + 370 + 375 + 385 + 385) / 12 = 3679 / 12 = 306.58
x̄2 = (125 + 147 + 118 + 160 + 128 + 149 + 145 + 150 + 140 + 115 + 155 + 152) / 12 = 1684 / 12 = 140.33
ȳ = (57 + 43 + 73 + 37 + 64 + 48 + 56 + 50 + 39 + 43 + 69 + 60) / 12 = 639 / 12 = 53.25

Using the given data, we can calculate the following deviation sums:

S11 = Σ(xi1 - x̄1)^2 = 49206.92
S22 = Σ(xi2 - x̄2)^2 = 2500.67
S12 = Σ((xi1 - x̄1)(xi2 - x̄2)) = 960.67
S1y = Σ((xi1 - x̄1)(yi - ȳ)) = 1043.25
S2y = Σ((xi2 - x̄2)(yi - ȳ)) = -509.00

Solving the normal equations with these values, we get:

b1 = (S1y*S22 - S2y*S12) / (S11*S22 - S12^2) = (1043.25*2500.67 - (-509)*960.67) / (49206.92*2500.67 - 960.67^2) = 0.0254
b2 = (S2y*S11 - S1y*S12) / (S11*S22 - S12^2) = ((-509)*49206.92 - 1043.25*960.67) / (49206.92*2500.67 - 960.67^2) = -0.2133
b0 = 53.25 - 0.0254*306.58 - (-0.2133)*140.33 = 75.40

Therefore, the fitted regression equation is:

Y' = 75.40 + 0.0254*x1 - 0.2133*x2
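
As a cross-check on the arithmetic above, here is a short Python sketch (assuming numpy is available; the variable names are just illustrative) that builds the deviation sums from the data table, solves the normal equations, and then verifies the coefficients against numpy's built-in least-squares solver:

```python
import numpy as np

# Data from the table above
y  = np.array([57, 43, 73, 37, 64, 48, 56, 50, 39, 43, 69, 60], dtype=float)
x1 = np.array([220, 215, 250, 241, 305, 258, 354, 321, 370, 375, 385, 385], dtype=float)
x2 = np.array([125, 147, 118, 160, 128, 149, 145, 150, 140, 115, 155, 152], dtype=float)

# Deviation sums of squares and cross-products
S11 = np.sum((x1 - x1.mean())**2)
S22 = np.sum((x2 - x2.mean())**2)
S12 = np.sum((x1 - x1.mean()) * (x2 - x2.mean()))
S1y = np.sum((x1 - x1.mean()) * (y - y.mean()))
S2y = np.sum((x2 - x2.mean()) * (y - y.mean()))

# Solve the 2x2 normal equations for the slopes, then recover the intercept
b1, b2 = np.linalg.solve(np.array([[S11, S12], [S12, S22]]),
                         np.array([S1y, S2y]))
b0 = y.mean() - b1 * x1.mean() - b2 * x2.mean()
print(b0, b1, b2)  # roughly 75.40, 0.0254, -0.2133

# Cross-check: ordinary least squares on the full design matrix
X = np.column_stack([np.ones_like(x1), x1, x2])
print(np.linalg.lstsq(X, y, rcond=None)[0])
```

The two print statements should agree to rounding, which is a useful sanity check on the hand calculation.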

To calculate the coefficient of multiple determination (R^2), we use the formula:

R^2 = 1 - (SSE / SST)

where SSE is the sum of squared residuals (the error sum of squares) and SST is the total sum of squares.

SSE = Σ((yi - Y'i)^2) = 1401.22
SST = Σ((yi - ȳ)^2) = 1536.25

Plugging these values into the formula, we get:

R^2 = 1 - (1401.22 / 1536.25) = 0.088

Therefore, the coefficient of multiple determination is approximately 0.088, i.e. x1 and x2 together explain only about 8.8% of the variation in Y.
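
A minimal Python check of SSE, SST, and R^2 (again assuming numpy; the fit is redone with the least-squares solver so the snippet stands on its own):

```python
import numpy as np

y  = np.array([57, 43, 73, 37, 64, 48, 56, 50, 39, 43, 69, 60], dtype=float)
x1 = np.array([220, 215, 250, 241, 305, 258, 354, 321, 370, 375, 385, 385], dtype=float)
x2 = np.array([125, 147, 118, 160, 128, 149, 145, 150, 140, 115, 155, 152], dtype=float)

# Fit Y' = b0 + b1*x1 + b2*x2 by ordinary least squares
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta

sse = np.sum((y - fitted)**2)    # residual (error) sum of squares, roughly 1401.2
sst = np.sum((y - y.mean())**2)  # total sum of squares, 1536.25
r2  = 1 - sse / sst              # coefficient of multiple determination
print(sse, sst, r2)              # R^2 roughly 0.088
```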

To test for individual coefficients, we can use the t-test. The null hypothesis is that the coefficient is equal to zero, while the alternative hypothesis is that the coefficient is not equal to zero.

To calculate the t-values for b1 and b2, we use the formulas:

t1 = b1 / SE(b1)
t2 = b2 / SE(b2)

where SE(b1) and SE(b2) are the standard errors of b1 and b2 respectively.

SE(b1) = s * √(S22 / (S11*S22 - S12^2))
SE(b2) = s * √(S11 / (S11*S22 - S12^2))

where s = √(SSE / (n - k - 1)) is the residual standard error, n is the number of observations, and k is the number of independent variables (2 in this case).

Using the given values, we can calculate:

s = √(1401.22 / (12 - 2 - 1)) = √155.69 = 12.48
SE(b1) = 12.48 * √(2500.67 / (49206.92*2500.67 - 960.67^2)) = 0.0565
SE(b2) = 12.48 * √(49206.92 / (49206.92*2500.67 - 960.67^2)) = 0.2505

Using these values, we can calculate the t-values:

t1 = 0.0254 / 0.0565 = 0.45
t2 = -0.2133 / 0.2505 = -0.85

To test the individual coefficients, we compare the absolute values of the t-values to critical values from the t-distribution. At a significance level of 0.05 and degrees of freedom equal to the number of observations minus the number of independent variables minus 1 (12 - 2 - 1 = 9), the critical t-value is approximately 2.262.

Since |t1| < 2.262 and |t2| < 2.262, we fail to reject the null hypothesis in both cases: neither coefficient is statistically significant at the 5% level.
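
The standard errors, t-values, and the critical value can be reproduced with the sketch below (assuming numpy and scipy are available; scipy is used only for the t-distribution):

```python
import numpy as np
from scipy import stats

y  = np.array([57, 43, 73, 37, 64, 48, 56, 50, 39, 43, 69, 60], dtype=float)
x1 = np.array([220, 215, 250, 241, 305, 258, 354, 321, 370, 375, 385, 385], dtype=float)
x2 = np.array([125, 147, 118, 160, 128, 149, 145, 150, 140, 115, 155, 152], dtype=float)

n, k = len(y), 2
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta
mse = np.sum(resid**2) / (n - k - 1)       # residual mean square, SSE / (n - k - 1)
cov = mse * np.linalg.inv(X.T @ X)         # estimated covariance matrix of b0, b1, b2
se  = np.sqrt(np.diag(cov))                # SE(b0), SE(b1), SE(b2)
t   = beta / se                            # t-statistics

print(se[1:], t[1:])                             # roughly SE: 0.056, 0.250; t: 0.45, -0.85
print(stats.t.ppf(0.975, n - k - 1))             # two-sided 5% critical value, about 2.262
print(2 * stats.t.sf(np.abs(t[1:]), n - k - 1))  # p-values for b1 and b2
```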

To test the overall significance of the model, we can use the F-test. The null hypothesis is that all coefficients are equal to zero, while the alternative hypothesis is that at least one coefficient is not equal to zero.

To calculate the F-value, we use the formula:

F = (SSR / k) / (SSE / (n - k - 1))

where SSR is the regression (explained) sum of squares, SSE is the sum of squared errors, and k is the number of independent variables.

SSR = SST - SSE = 1536.25 - 1401.22 = 135.03
n - k - 1 = 12 - 2 - 1 = 9

Plugging these values into the formula, we get:

F = (135.03 / 2) / (1401.22 / 9) = 67.51 / 155.69 = 0.43

To test the overall significance, we compare the F-value to the critical F-value from the F-distribution. At a significance level of 0.05, with numerator degrees of freedom equal to the number of independent variables (2) and denominator degrees of freedom equal to n - k - 1 (9), the critical F-value is approximately 4.26.

Since F = 0.43 < 4.26, we fail to reject the null hypothesis: the model is not statistically significant overall, which is consistent with the low R^2 and with the individual t-tests.
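
For completeness, a short sketch of the F-test (same assumptions as before: numpy for the fit, scipy only for the critical value and p-value):

```python
import numpy as np
from scipy import stats

y  = np.array([57, 43, 73, 37, 64, 48, 56, 50, 39, 43, 69, 60], dtype=float)
x1 = np.array([220, 215, 250, 241, 305, 258, 354, 321, 370, 375, 385, 385], dtype=float)
x2 = np.array([125, 147, 118, 160, 128, 149, 145, 150, 140, 115, 155, 152], dtype=float)

n, k = len(y), 2
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

sse = np.sum((y - X @ beta)**2)   # residual (error) sum of squares
sst = np.sum((y - y.mean())**2)   # total sum of squares
ssr = sst - sse                   # regression (explained) sum of squares

F = (ssr / k) / (sse / (n - k - 1))     # roughly 0.43
print(F)
print(stats.f.ppf(0.95, k, n - k - 1))  # critical F(0.05; 2, 9), about 4.26
print(stats.f.sf(F, k, n - k - 1))      # p-value of the overall test
```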