A study of 36 marathon runners showed that they could run at an average rate of 7.8 miles per hour. The sample standard deviation is 0.6. Find a point estimate of the population mean. Find the 90% confidence interval for the mean of all runners. Based on the results, what minimum speed should a runner obtain to qualify to run in a marathon?

Z = (score-mean)/SEm

SEm = SD/√n

90% CI = mean ± 1.645 × SEm

What is your cutoff point to run?

To find the point estimate of the population mean, we simply use the average rate of the 36 marathon runners, which is given as 7.8 miles per hour.

To find the 90% confidence interval for the mean of all runners, we need to use the formula for a confidence interval:

Confidence Interval = sample mean ± (critical value) * (standard deviation / sqrt(sample size))

The critical value comes from the standard normal distribution (a z-table or calculator). For a 90% confidence interval, z(α/2) = z(0.05) ≈ 1.645.
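If you prefer to check the critical value programmatically rather than from a table, here is a minimal sketch, assuming SciPy is installed (the variable names are just illustrative):

```python
from scipy.stats import norm

# A two-sided 90% confidence level leaves 5% in each tail,
# so the critical value is the 95th percentile of the standard normal.
confidence = 0.90
z_crit = norm.ppf(1 - (1 - confidence) / 2)
print(round(z_crit, 3))  # ~1.645
```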

Calculating the confidence interval:

Confidence Interval = 7.8 ± 1.645 * (0.6 / sqrt(36))

= 7.8 ± 1.645 * (0.1)

= 7.8 ± 0.1645

= (7.6355, 7.9645)

Therefore, the 90% confidence interval for the mean speed of all runners is (7.6355, 7.9645) miles per hour.
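To double-check the arithmetic, here is a short sketch of the same calculation in plain Python (variable names are illustrative, not from the original problem):

```python
import math

n = 36        # sample size
mean = 7.8    # sample mean (miles per hour)
sd = 0.6      # sample standard deviation
z = 1.645     # critical value for a 90% confidence interval

sem = sd / math.sqrt(n)   # standard error of the mean: 0.6 / 6 = 0.1
margin = z * sem          # margin of error: 1.645 * 0.1 = 0.1645
lower, upper = mean - margin, mean + margin
print(f"90% CI: ({lower:.4f}, {upper:.4f})")  # (7.6355, 7.9645)
```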

To determine the minimum speed a runner should obtain to qualify to run in a marathon, we need to consider the lower limit of the confidence interval. From the calculation above, the lower limit is 7.6355 miles per hour.

Hence, based on the given data, a runner should maintain a minimum speed of approximately 7.64 miles per hour to qualify to run in a marathon.