Posted by **Crystal** on Friday, February 25, 2011 at 12:53pm.

If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtract the mean (100) from each score, and then divide each of those resulting scores by the standard deviation (16)?
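Subtracting the mean from each score and then dividing by the standard deviation is the standard z-score transformation: the resulting distribution has a mean of 0 and a standard deviation of 1, with each value now expressed in standard-deviation units. A minimal sketch using a few hypothetical IQ scores (the specific values are illustrative, not from the question):

```python
mean, sd = 100, 16

# hypothetical raw IQ scores chosen for illustration
scores = [68, 84, 100, 116, 132]

# z-score each value: (score - mean) / sd
z_scores = [(x - mean) / sd for x in scores]
print(z_scores)  # [-2.0, -1.0, 0.0, 1.0, 2.0]
```

For example, a raw score of 116 becomes (116 − 100) / 16 = 1.0, meaning it lies exactly one standard deviation above the mean.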