By measuring the amount of time it takes a component of a product to move from one workstation to the next, an engineer has estimated that the standard deviation is 6 seconds.
(a) How many measurements should be made to be 90% certain that the maximum error of estimation will not exceed 1 second?
(b) What sample size is required for a maximum error of 2 seconds?
So far I have E= 1.65(6/square root of 1).
To determine the sample size required to achieve a given maximum error of estimation, we can use the formula:
n = (Z * σ / E)²
Where:
n = sample size
Z = Z-score corresponding to the desired confidence level
σ = standard deviation
E = maximum error of estimation
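This formula can be sketched as a small Python helper (the function name `required_sample_size` is my own; `math.ceil` rounds up because a fractional number of measurements isn't possible and rounding down would violate the error bound):

```python
import math

def required_sample_size(z, sigma, max_error):
    """Smallest n such that z * sigma / sqrt(n) <= max_error."""
    return math.ceil((z * sigma / max_error) ** 2)

# Examples from this problem (z = 1.645, sigma = 6 seconds):
print(required_sample_size(1.645, 6, 1))  # 98
print(required_sample_size(1.645, 6, 2))  # 25
```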
(a) For a 90% confidence level and a maximum error of 1 second:
Z = 1.645 (corresponding to a 90% confidence level, as you were close with 1.65)
σ = 6 seconds
E = 1 second
Plugging in these values, we have:
n = (1.645 * 6 / 1)²
n = (9.87)² = 97.4169
Round up to the next whole number, so we need at least 98 measurements to be 90% confident that the maximum error of estimation will not exceed 1 second.
(b) For a maximum error of 2 seconds, the equation remains the same:
Z = 1.645
σ = 6 seconds
E = 2 seconds
n = (1.645 * 6 / 2)²
n = (4.935)² = 24.3542
Round up to the next whole number, so we need at least 25 measurements for a maximum error of 2 seconds.
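As a check, both answers can be reproduced in Python using the exact z-value instead of the rounded 1.645 (Python's standard-library `statistics.NormalDist` gives the inverse normal CDF; for a two-sided 90% interval, 5% sits in each tail, so z = inv_cdf(0.95)):

```python
import math
from statistics import NormalDist

# Two-sided 90% confidence -> 5% in each tail
z = NormalDist().inv_cdf(0.95)  # ~1.6449
sigma = 6  # seconds

for e in (1, 2):
    n = math.ceil((z * sigma / e) ** 2)
    print(f"E = {e} s -> n = {n}")
```

This prints n = 98 for E = 1 and n = 25 for E = 2, matching the hand calculation above.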