Eric has computed that it takes an average (mean) of 17 minutes, with a standard deviation of 3 minutes, to drive from home, park the car, and walk to his job. One day it took Eric 21 minutes to get to work. To find how many standard deviations that raw score lies from the mean, you use the formula for transforming a raw score into a z-score: subtract the mean of 17 from his "score" of 21, then divide the result (4) by the standard deviation of 3. The resulting z-score of about 1.33 tells you that Eric's time to get to work is 1.33 standard deviations above the mean.

To determine the z-score (also known as the standard score) of a raw score, you use the formula:

z = (X - μ) / σ

In this formula:
- X represents the raw score (which is 21 in this case)
- μ represents the mean (which is 17 minutes in this case)
- σ represents the standard deviation (which is 3 minutes in this case)
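
The formula translates directly into code. Below is a minimal Python sketch; the function name z_score and the guard against a non-positive standard deviation are illustrative assumptions, not part of the original problem.

    def z_score(x, mu, sigma):
        # How many standard deviations the raw score x lies from the mean mu
        if sigma <= 0:
            raise ValueError("standard deviation must be positive")
        return (x - mu) / sigma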

So, to calculate Eric's z-score, we can substitute these values into the formula:

z = (21 - 17) / 3
z = 4 / 3
z ≈ 1.33

Therefore, Eric's z-score is approximately 1.33, which tells us that his time to get to work is 1.33 standard deviations above the mean.
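
As a quick check, plugging the numbers from the problem into the same formula in plain Python reproduces the result (the variable names are just for illustration):

    x, mu, sigma = 21, 17, 3    # raw score, mean, and standard deviation, all in minutes
    z = (x - mu) / sigma        # (21 - 17) / 3
    print(round(z, 2))          # prints 1.33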