A sociologist interested in cultural differences compared women of two ethnic groups on a Role Approval Index (high scores mean a high degree of approval of one's social role). The results are as follows:

Ethnic Group A: N = 15, M = 55, S² = 6.5
Ethnic Group B: N = 23, M = 51, S² = 4.5
If the standard deviation of the distribution of the difference between means is 0.76, what is the t score?


- (15-23)/.76 = -10.53

- (.76)(15 - 23) = -8.00

- [(6.5 + 4.5)/2][.76] = 4.18

- (55 - 51)/.76 = 5.26


To compute the t score, divide the difference between the means by the standard deviation of the distribution of the difference between means: t = (M₁ − M₂) / s(M₁ − M₂). Using the given values:

- (55 − 51) / 0.76 = 4 / 0.76 ≈ 5.26

Therefore, the t score is 5.26, which matches the last option.
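
As a cross-check, here is a minimal Python sketch. The problem states the 0.76 standard error directly, so deriving it via the usual pooled-variance formula for independent samples is an assumption on my part; it does reproduce 0.76 from the given Ns and variances:

```python
import math

# Given summary statistics from the problem.
n_a, m_a, s2_a = 15, 55, 6.5   # Ethnic Group A
n_b, m_b, s2_b = 23, 51, 4.5   # Ethnic Group B

# Pooled-variance standard error of the difference between means
# (assumed derivation; the problem simply states the value 0.76).
s2_pooled = ((n_a - 1) * s2_a + (n_b - 1) * s2_b) / (n_a + n_b - 2)
se_diff = math.sqrt(s2_pooled * (1 / n_a + 1 / n_b))
print(f"standard error of difference: {se_diff:.2f}")  # 0.76

# t score: difference between means divided by that standard error.
t = (m_a - m_b) / se_diff
print(f"t = {t:.2f}")  # 5.25 (5.26 when using the rounded 0.76)
```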