A tech company develops a navigation app for smartphones that can compute the usual time it takes to get from one location to another. The company collects location data from 100 smartphones to determine how long it takes to drive from Cleveland, Ohio, to Detroit, Michigan.

The company finds that it takes an average of 2.78 hours to drive this route, with a standard deviation of 0.06 hours. The driving times appear to be normally distributed.

This company wants to provide an estimate of a range of driving times that includes the driving times of 95% of users.

What would this range be?

a) 0 to 2.66 hours
b) 2.60 to 2.96 hours
c) 1.82 to 3.74 hours
d) 2.66 to 2.90 hours

so which one? a, b, c, or d?

The answer is D for people in the future :)

Hmmmmmmm, I'm just as confused as Mrs. Jessica.

To find the range of driving times that includes the driving times of 95% of users, we need the interval around the mean that contains 95% of individual driving times. (Strictly speaking this is not a confidence interval for the mean, so we use the standard deviation itself rather than dividing it by √100.)

First, we need the critical value that bounds the middle 95% of a normal distribution. Since the driving times appear to be normally distributed, we can read this off a Z-table: the critical value is approximately 1.96, meaning 95% of a normal population lies within 1.96 standard deviations of the mean.
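If you'd rather not read a Z-table, the same critical value can be pulled from a statistics library. Here is a minimal sketch using SciPy (an assumption on my part; the problem doesn't call for any particular tooling):

```python
# Sketch: look up the z-value that bounds the middle 95% of a normal
# distribution, instead of reading it from a printed Z-table.
from scipy.stats import norm

# 95% in the middle leaves 2.5% in each tail, so ask for the 97.5th percentile.
z_critical = norm.ppf(0.975)
print(round(z_critical, 2))  # ~1.96
```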

Second, we calculate the margin on either side of the mean, which is the product of the critical value and the standard deviation. In this case, that margin is 1.96 × 0.06 = 0.1176 hours.

Finally, we find the range by subtracting and adding that margin to the average driving time. The range is (2.78 − 0.1176) to (2.78 + 0.1176), or 2.6624 to 2.8976, which rounds to 2.66 to 2.90 hours.
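Putting the steps together, here is a small sketch of the whole calculation (variable names are just illustrative; the values come straight from the problem):

```python
# Sketch of the full calculation: mean ± z * standard deviation.
mean_hours = 2.78   # average driving time
sd_hours = 0.06     # standard deviation of individual driving times
z_critical = 1.96   # z-value for the middle 95% of a normal distribution

half_width = z_critical * sd_hours       # 0.1176 hours
low = mean_hours - half_width            # ~2.66 hours
high = mean_hours + half_width           # ~2.90 hours
print(f"{low:.2f} to {high:.2f} hours")  # 2.66 to 2.90 hours
```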

So, the correct answer is d) 2.66 to 2.90 hours.

Start with the mean,

add 2 s.d. for the max,

subtract 2 s.d. for the min.

95% of the population is within 2 s.d. of the mean, so this shortcut gives the same answer (see the quick check below).
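For reference, a quick sketch of that 2-standard-deviation shortcut. It rounds 1.96 up to 2, so the interval comes out slightly wider, but it rounds to the same answer choice:

```python
# Sketch of the empirical-rule shortcut: roughly 95% of a normal population
# falls within 2 standard deviations of the mean.
mean_hours = 2.78
sd_hours = 0.06

low = mean_hours - 2 * sd_hours          # 2.66 hours
high = mean_hours + 2 * sd_hours         # 2.90 hours
print(f"{low:.2f} to {high:.2f} hours")  # 2.66 to 2.90 hours
```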