Alek's mean on 10 tests for the quarter was 89. He complained to the teacher that he should be given an A because he missed the cutoff of 90 by only a single point. Explain whether it is clear that he really missed an A by only a single point.

He missed an A by 10 total points: a mean of 89 over 10 tests gives a point total of 890, while a 90 average would require 900.


To determine whether Alek missed an A by only a single point, we need to calculate the total score he received on the 10 tests for the quarter. Given that his mean or average score was 89, we know that the sum of his scores on the 10 tests is:

Sum of scores = Mean × Number of tests = 89 × 10 = 890

Now, to determine whether Alek missed an A by only a single point, we need to know the grading scale or cutoffs for each grade. Let's assume that an A requires a minimum score of 90.

In this case, we must compare like quantities. The cutoff of 90 is a per-test (mean) value, so we convert it to a total by multiplying by the number of tests:

Minimum total for an A = 90 × 10 = 900
Alek's total score = 890

Difference = Minimum total for an A − Alek's total score = 900 − 890 = 10

So Alek fell short of an A by 10 total points, not just a single point. His mean was only 1 point below the cutoff, but that 1-point-per-test gap, spread over 10 tests, amounts to 10 points in all. Therefore it is not clear that he missed an A "by only a single point": he would have needed 10 more points over the quarter to bring his average up to 90.
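The arithmetic above can be sketched in a few lines of Python (the variable names are illustrative, not from the problem statement):

```python
# Shortfall from an A, based on the quarter's mean over 10 tests.
num_tests = 10
mean_score = 89
a_cutoff = 90  # minimum mean required for an A

total_points = mean_score * num_tests   # points Alek actually earned: 890
points_needed = a_cutoff * num_tests    # total a 90 average requires: 900
shortfall = points_needed - total_points

print(f"Total earned: {total_points}")          # 890
print(f"Total needed for an A: {points_needed}")  # 900
print(f"Shortfall: {shortfall} points")          # 10, not 1
```

Running this confirms that the 1-point gap in the mean corresponds to a 10-point gap in total points earned.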