A machine on an assembly line fills cans with quantities of food that are normally distributed with a standard deviation of 0.057 pounds. The mean quantity filled is estimated using a sample of 100 cans. What is the difference between the upper and lower limits of the 95% confidence interval for the mean?

A. 0.0057 lb.
B. 0.011 lb.
C. 0.022 lb.
D. 0.11 lb.
E. 0.22 lb.

Since the mean is estimated from a sample of 100 cans, the relevant spread is the standard error of the mean, 0.057/√100 = 0.0057 lb, not the standard deviation of a single can. About 95% of sample means fall within roughly 2 standard errors of the true mean, so the interval is about 2 × 2 × 0.0057 ≈ 0.023 lb wide.

The answer is C

To find the difference between the upper and lower limits of the 95% confidence interval for the mean, we need to calculate the margin of error and then multiply it by 2.

The margin of error can be found using the formula: Z * (standard deviation / square root of sample size)

Since the sample size is 100, the standard deviation is 0.057 pounds, and we want a 95% confidence interval, we need to find the value of Z for a 95% confidence level. For a normal distribution, the Z value is approximately 1.96.

Plug these values into the formula to calculate the margin of error: 1.96 * (0.057 / √100) = 1.96 * (0.057 / 10) ≈ 0.01117 pounds.

The difference between the upper and lower limits of the 95% confidence interval is twice the margin of error, so: 2 * 0.01117 ≈ 0.0223 pounds.

Therefore, the correct answer is C. 0.022 lb.
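The calculation above can be checked with a short Python sketch, using the standard library's `statistics.NormalDist` to get the critical value instead of hard-coding 1.96:

```python
from statistics import NormalDist

# Values given in the problem.
sigma = 0.057       # population standard deviation, in pounds
n = 100             # sample size
confidence = 0.95

# Two-sided critical value: the z-score at the (1 + confidence)/2 quantile.
z = NormalDist().inv_cdf((1 + confidence) / 2)   # ≈ 1.96

margin_of_error = z * sigma / n ** 0.5           # ≈ 0.0112 lb
interval_width = 2 * margin_of_error             # ≈ 0.0223 lb

print(round(interval_width, 3))                  # → 0.022
```

Rounding the width to three decimal places recovers choice C, 0.022 lb.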