Posted by **Crystal** on Friday, February 25, 2011 at 12:53pm.

If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtracted the mean (100) from each score and then divided each resulting score by the standard deviation (16)?
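
For illustration, here is a minimal Python sketch of the transformation described in the question, applied to a few hypothetical (assumed) raw scores. Subtracting the mean and dividing by the standard deviation is the standard z-score conversion, so the transformed scores end up with a mean of 0 and a standard deviation of 1.

```python
# Hypothetical raw IQ scores (assumed for illustration only)
iq_scores = [84, 100, 116, 132]

mean, sd = 100, 16

# Subtract the mean from each score, then divide by the standard deviation
z_scores = [(x - mean) / sd for x in iq_scores]

print(z_scores)  # [-1.0, 0.0, 1.0, 2.0]
```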
