
If a distribution of IQ test scores has a mean of 100 and a standard deviation of 16, what would be the result if we first subtract the mean (100) from each score, and then divide each of those resulting scores by the standard deviation (16)? 

  • statistics -

    Z = (score-mean)/SD

    You would have the Z score for each raw score, expressing each value in terms of standard deviations from the mean. The transformed distribution of Z scores has a mean of 0 and a standard deviation of 1.
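    A minimal sketch of the transformation in Python (function and score names are illustrative, assuming the mean of 100 and SD of 16 from the question):

    ```python
    def z_score(score, mean=100.0, sd=16.0):
        """Convert a raw score to a Z score: standard deviations from the mean."""
        return (score - mean) / sd

    # Raw IQ scores one SD below, at, one SD above, and two SDs above the mean
    scores = [84, 100, 116, 132]
    z = [z_score(s) for s in scores]
    print(z)  # [-1.0, 0.0, 1.0, 2.0]
    ```

    A score of 116 becomes (116 - 100) / 16 = 1.0, i.e. one standard deviation above the mean.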
