In a film study, a sample of 300 movies was found to have an average runtime of 116 minutes with a standard deviation of 15 minutes. What is the margin of error for this sample of movie runtimes at 90% confidence (z = 1.64)?

A) 10.98 minutes
B) 11.6 minutes
C) 1.89 minutes
D) 1.42 minutes

The margin of error can be calculated using the formula:

Margin of Error = z * (standard deviation / √n)

where z is the z-score for the desired confidence level, the standard deviation is the sample standard deviation, and n is the sample size. Because the standard deviation is measured in minutes, the margin of error is also in minutes.

Given:
z = 1.64
standard deviation = 15 minutes
n = 300

Plugging these values into the formula, we get:

Margin of Error = 1.64 * (15 / √300)
= 1.64 * (15 / 17.32)
= 1.64 * 0.866
≈ 1.42 minutes
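The arithmetic above can be checked with a short calculation (a minimal sketch using only the values given in the problem):

```python
import math

z = 1.64   # z-score for 90% confidence, as given
sd = 15.0  # sample standard deviation, in minutes
n = 300    # sample size

# Margin of Error = z * (standard deviation / sqrt(n))
margin_of_error = z * (sd / math.sqrt(n))
print(round(margin_of_error, 2))  # prints 1.42
```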

Therefore, the margin of error for this sample of movie runtimes at 90% confidence is approximately 1.42 minutes.

Answer: D) 1.42 minutes