A sociologist interested in cultural differences compared women of two ethnic groups on a Role Approval Index (high scores indicate a high degree of approval of one's social role). The results are as follows:

Ethnic Group A: N = 15, M = 55, s^2 = 6.5
Ethnic Group B: N = 23, M = 51, s^2 = 4.5

If the standard deviation of the distribution of the difference between means is .76, what is the t score?

To find the t score, we divide the difference between the sample means by the standard error of the difference between means. The problem tells us this standard error is .76, and we can verify it with the pooled-variance formula, which is the appropriate one for an independent-samples t test when equal population variances are assumed:

SE = sqrt(sp^2 * (1/n1 + 1/n2))

where the pooled variance is

sp^2 = ((n1 - 1)s1^2 + (n2 - 1)s2^2) / (n1 + n2 - 2)

Where:
- s1^2 and s2^2 are the sample variances of Group A and Group B, respectively.
- n1 and n2 are the sample sizes of Group A and Group B, respectively.

Using the given information:
s1^2 = 6.5, n1 = 15
s2^2 = 4.5, n2 = 23

sp^2 = ((14)(6.5) + (22)(4.5)) / 36
= (91 + 99) / 36
≈ 5.28

SE = sqrt(5.28 * (1/15 + 1/23))
≈ sqrt(5.28 * 0.110)
≈ sqrt(0.581)
≈ 0.76

This matches the value given in the problem.
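As a cross-check, the pooled-variance standard error (the formula that reproduces the .76 stated in the problem) can be computed with a short Python sketch; the variable names here are my own:

```python
import math

# Sample statistics from the problem
n1, var1 = 15, 6.5   # Ethnic Group A: sample size and variance
n2, var2 = 23, 4.5   # Ethnic Group B: sample size and variance

# Pooled variance: a weighted average of the two sample variances,
# weighted by their degrees of freedom
sp2 = ((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)

# Standard error of the difference between means
se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
print(round(se, 2))  # 0.76
```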

Next, we calculate the t score using the formula:

t = (M1 - M2) / SE

Where:
- M1 and M2 are the means of Group A and Group B, respectively.
- SE is the standard error of the difference between means.

Using the given information:
M1 = 55
M2 = 51
SE = 0.76

t = (55 - 51) / 0.76
= 4 / 0.76
≈ 5.26

Therefore, the t score is approximately 5.26.
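The full calculation can be reproduced in a few lines of Python (a minimal sketch using the standard error given in the problem; variable names are my own):

```python
# Sample means from the problem
m1 = 55   # Ethnic Group A
m2 = 51   # Ethnic Group B

# Standard deviation of the distribution of the difference
# between means, as given in the problem
se = 0.76

# t score for the difference between independent means
t = (m1 - m2) / se
print(round(t, 2))  # 5.26
```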