Posted by Anonymous.
A computer manufacturer is about to unveil a new, faster personal computer. The new machine is clearly faster, but initial tests indicate there is more variation in its processing time. The processing time depends on the particular program being run, the amount of input data, and the amount of output. A sample of 16 computer runs, covering a range of production jobs, showed that the standard deviation of the processing time was 22 (hundredths of a second) for the new machine and 12 (hundredths of a second) for the current machine. At the .05 significance level, can we conclude that there is more variation in the processing time of the new machine?
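This is a one-tailed F-test for equality of variances. A minimal sketch of the calculation, assuming 16 runs on each machine (the problem states a sample of 16 but does not say explicitly whether that is per machine), with the larger sample variance in the numerator:

```python
from scipy.stats import f

# Assumption: n = 16 runs on each machine (15 degrees of freedom each)
n_new, n_current = 16, 16
s_new, s_current = 22.0, 12.0  # standard deviations, hundredths of a second

# H0: sigma_new^2 <= sigma_current^2;  H1: sigma_new^2 > sigma_current^2
F = (s_new ** 2) / (s_current ** 2)          # 484 / 144 ≈ 3.36
critical = f.ppf(0.95, n_new - 1, n_current - 1)  # right-tail critical value at alpha = .05
p_value = f.sf(F, n_new - 1, n_current - 1)

print(f"F = {F:.3f}, critical = {critical:.3f}, p = {p_value:.4f}")
print("Reject H0" if F > critical else "Fail to reject H0")
```

Since the computed F (about 3.36) exceeds the critical value of F(.05; 15, 15), the null hypothesis is rejected: at the .05 level there is evidence of more variation in the new machine's processing time.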