Mr. Moore is trying to convince his high school students that it is better to slow down when they are learning to type. He believes that students who type too quickly make too many errors. To prove this point, Mr. Moore has the students take a typing test. He records their average typing speed (in words per minute), x, and the number of errors they made, y. The line of best fit (regression line) for this data is y = 3.473x - 155.317. (5 points)

1. Using the regression line, about how many errors would be made if the students typed zero words per minute?
2. For an increase of one word per minute in typing speed, the regression line predicts an increase of how many errors?
3. If a student types at 50 words per minute, how many errors does the regression line predict will be made?
4. If a student types 70 words per minute, how many errors does the regression line predict will be made?
5. Does this support Mr. Moore's claim that the slower a student types, the fewer errors will be made?

Using the regression line, we can calculate:

1. If x = 0:
y = 3.473(0) - 155.317
y = -155.317

Therefore, if the students typed zero words per minute, the regression line predicts about -155 errors. Since a negative number of errors is impossible, the intercept has no practical meaning here; x = 0 lies far outside the range of the observed typing speeds.

2. For an increase of one word per minute typing speed:
The slope of the regression line is 3.473, so for each increase of one word per minute in typing speed, the regression line predicts about 3.473 additional errors.

3. If x = 50:
y = 3.473(50) - 155.317
y = 173.65 - 155.317
y = 18.333

Therefore, if a student types at 50 words per minute, the regression line predicts approximately 18.3 errors.

4. If x = 70:
y = 3.473(70) - 155.317
y = 243.11 - 155.317
y = 87.793

Therefore, if a student types 70 words per minute, the regression line predicts approximately 87.8 errors.

Based on the regression line, as typing speed increases, the predicted number of errors also increases (from about 18 errors at 50 words per minute to about 88 errors at 70 words per minute). Equivalently, the slower a student types, the fewer errors the line predicts, which does support Mr. Moore's claim.
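As a quick check of the arithmetic above, here is a minimal Python sketch that evaluates the regression line at the typing speeds asked about (the helper name predict_errors is just for illustration, not part of the problem):

# Regression line from the problem: y = 3.473x - 155.317,
# where x is typing speed in words per minute and y is the predicted number of errors.
def predict_errors(wpm):
    return 3.473 * wpm - 155.317

for speed in (0, 50, 70):
    # Prints -155.317, 18.333, and 87.793 respectively.
    print(f"{speed} wpm -> {predict_errors(speed):.3f} predicted errors")

Running it reproduces the values used in parts 1, 3, and 4.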