If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtract the mean (100) from each score, and then divide each of those resulting scores by the standard deviation (16)? 


Z = (score - mean) / SD

You would have the z-score for each raw score, expressing its value in terms of standard deviations from the mean. The resulting distribution of z-scores has a mean of 0 and a standard deviation of 1.
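As a minimal sketch, the formula can be written directly in Python; the raw scores 92 and 124 below are hypothetical examples chosen for illustration:

def z_score(raw, mean=100, sd=16):
    # Z = (score - mean) / SD, using the IQ parameters from the question
    return (raw - mean) / sd

print(z_score(92))   # -0.5 -> half a standard deviation below the mean
print(z_score(124))  #  1.5 -> one and a half standard deviations above the mean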

To understand the result of subtracting the mean and dividing by the standard deviation, we need to talk about standardization, also known as the z-score transformation.

Standardization measures how far a particular data point is from the mean, in terms of standard deviations. It allows us to compare values drawn from distributions with different means and standard deviations.
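For example, here is a small Python sketch comparing the IQ score of 120 with a score on a second, hypothetical test whose mean (500) and standard deviation (100) are invented purely for illustration:

# Which performance is further above its own mean?
iq_z    = (120 - 100) / 16    # IQ test:    mean 100, SD 16  -> 1.25
other_z = (610 - 500) / 100   # other test: mean 500, SD 100 -> 1.10 (hypothetical figures)

print(iq_z, other_z)  # 1.25 1.1

Because both values are expressed in standard-deviation units, they can be compared directly even though the raw scales differ.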

In this case, we have an IQ distribution with a mean of 100 and a standard deviation of 16. Let's say we have an IQ score of 120. To standardize this score, we subtract the mean (100) and divide by the standard deviation (16).

Standardized score = (IQ score - Mean) / Standard deviation

Standardized score = (120 - 100) / 16
= 20 / 16
= 1.25

So, for an IQ score of 120, subtracting the mean (100) and then dividing by the standard deviation (16) gives a standardized score of 1.25. This indicates that the IQ score of 120 is 1.25 standard deviations above the mean.

More generally, if we subtract the mean of 100 from each score and then divide by the standard deviation of 16, we standardize the entire set of scores. This process is known as standardization, or the z-score transformation.

To calculate the standardized score for each IQ test score, we can use the formula:

Standardized Score = (Raw Score - Mean) / Standard Deviation

In this case, the mean is 100 and the standard deviation is 16. So, the formula becomes:

Standardized Score = (Raw Score - 100) / 16

By plugging each raw score into this formula, you can calculate its standardized score, which tells you how many standard deviations that score lies above or below the mean of 100.
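As a final sketch, the following Python snippet applies the formula to a short, hypothetical list of raw IQ scores (the specific scores are made up for illustration):

# Hypothetical raw IQ scores
raw_scores = [84, 100, 108, 120, 132]

# Standardized Score = (Raw Score - 100) / 16
z_scores = [(score - 100) / 16 for score in raw_scores]

print(z_scores)  # [-1.0, 0.0, 0.5, 1.25, 2.0]
# Applied to the full distribution (mean 100, SD 16), the standardized
# scores would themselves have a mean of 0 and a standard deviation of 1.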