A sample of n=20 has a mean of M = 40. If the standard deviation is s=5, would a score of X= 55 be considered an extreme value? Why or why not?

To determine if a score of X=55 is considered an extreme value, we need to examine how far it is from the mean in terms of standard deviations. This can be done by calculating the "z-score" of the value.

The z-score measures the number of standard deviations a particular score is away from the mean. It is calculated using the formula:
z = (X - M) / s

where X is the score, M is the mean, and s is the standard deviation.

Plugging in the values from the problem:
X = 55, M = 40, and s = 5,
we can calculate the z-score as follows:
z = (55 - 40) / 5
z = 15 / 5 = 3
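As a quick check, the same calculation can be reproduced in a few lines of Python (the function name here is just illustrative):

```python
def z_score(x, mean, sd):
    """Number of standard deviations x lies above (or below) the mean."""
    return (x - mean) / sd

z = z_score(55, 40, 5)
print(z)  # 3.0
```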

A z-score of 3 indicates that the score of 55 is 3 standard deviations above the mean.

Whether a score is considered extreme depends on the context. In a normal distribution, about 95% of scores fall within 2 standard deviations of the mean and about 99.7% fall within 3, so a z-score of 3 is quite rare. A common convention is to treat z-scores greater than 2 (or less than -2) as extreme values.
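Assuming the scores are approximately normally distributed, the rarity of a given z-score can be quantified with the standard normal upper-tail probability. This sketch uses only Python's standard library (via the complementary error function):

```python
import math

def upper_tail(z):
    """P(Z > z) for a standard normal variable Z, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Roughly 2.3% of scores exceed z = 2, but only about 0.13% exceed z = 3.
print(upper_tail(2))
print(upper_tail(3))
```

This makes the "quite rare" claim concrete: a score as high as X = 55 would occur only about 1 time in 700 under a normal model.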

In this case, the z-score of 3 places the score of 55 well beyond that cutoff, so it would indeed be considered an extreme value.