A population has μ = 100 and σ = 20. If you select a single score from this population, on average, how close would it be to the population mean? Explain your answer.

I don't understand the question itself. Please help.

Do you know the 68-95-99.7 rule? In a normal distribution, approximately 68% of scores fall within one standard deviation of the mean (34% on each side), about 95% within two standard deviations, and about 99.7% within three.
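As a quick check of these percentages, here is a minimal simulation sketch (assuming the population is normal with μ = 100 and σ = 20, and using NumPy, which is my choice and not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 100, 20

# Draw a large number of scores from an assumed normal population.
scores = rng.normal(mu, sigma, size=1_000_000)

# Fraction of scores within 1, 2, and 3 standard deviations of the mean.
for k in (1, 2, 3):
    frac = np.mean(np.abs(scores - mu) <= k * sigma)
    print(f"within {k} SD: {frac:.3f}")  # roughly 0.683, 0.954, 0.997
```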

The question is asking about the expected closeness of a randomly selected score from a population to the population mean. To answer it, we need to understand the concepts of the mean (μ) and the standard deviation (σ).

The mean (μ) represents the average value in the population. In this case, the mean is 100, so the average score in the population is 100.

The standard deviation (σ) measures the amount of variability or spread in the population. In this case, the standard deviation is 20, which means that scores in the population are, on average, about 20 units away from the mean.

To determine how close a randomly selected score would be to the population mean on average, we do not need any new quantity: the standard deviation is, by definition, the standard (typical) distance of individual scores from the mean. The standard error (SE) answers a related but different question, namely how far the mean of a sample of n scores typically falls from the population mean:

SE = σ / √n

where σ is the population standard deviation and n is the sample size.

If you treat a single selected score as a sample of size n = 1, the sample mean is just that score, so the two ideas agree:

SE = 20 / √1 = 20

So, on average, a randomly selected score from this population is expected to be about 20 units away from the population mean of 100, that is, about one standard deviation.
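To see this numerically, here is a small sketch along the same lines (again assuming a normal population, although the question does not state its shape) that measures the standard, i.e. root-mean-square, distance of single scores from the mean:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 100, 20

# Simulate many repetitions of "select a single score from the population".
scores = rng.normal(mu, sigma, size=1_000_000)

# The standard deviation is the root-mean-square distance from the mean,
# so this should come out very close to sigma = 20.
rms_distance = np.sqrt(np.mean((scores - mu) ** 2))
print(f"standard (RMS) distance from the mean: {rms_distance:.1f}")  # about 20.0
```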