A university has 1000 computers available for students to use. Each computer has a 250 gigabyte hard drive. The university wants to estimate the average space used on the hard drives. A random sample of 100 computers showed a mean of 115 gigabytes used with a standard deviation of 20 gigabytes. What is the standard error of the mean?

standard error of the mean = 20 / sqrt(100) = 2 gigabytes.

That is, the sample mean falls within ±2 gigabytes of the true population mean about 68 percent of the time.
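If it helps to see where the 68 percent figure comes from, here is a minimal simulation sketch in Python. It assumes the population of drive usage is roughly normal with a true mean of 115 gigabytes and standard deviation of 20 gigabytes; those population values are hypothetical, chosen only to match the sample statistics in the question.

```python
import numpy as np

# Hypothetical population: roughly normal, true mean 115 GB, SD 20 GB.
# These are assumptions for illustration; the real population is unknown.
rng = np.random.default_rng(0)
true_mean, sigma, n, trials = 115, 20, 100, 100_000

# Draw many samples of 100 computers and record each sample mean.
sample_means = rng.normal(true_mean, sigma, size=(trials, n)).mean(axis=1)

# Fraction of sample means within one standard error (2 GB) of the true mean.
coverage = np.mean(np.abs(sample_means - true_mean) <= 2)
print(coverage)  # roughly 0.68
```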

To find the standard error of the mean, you can use the formula:

Standard Error of the Mean = Standard Deviation / Square Root of Sample Size

In this case, you have the standard deviation (20 gigabytes) from the sample of 100 computers. Now you just need to calculate the square root of the sample size (100) and divide the standard deviation by this value.

Standard Error of the Mean = 20 / sqrt(100)

The square root of 100 is 10, so:

Standard Error of the Mean = 20 / 10 = 2 gigabytes

Therefore, the standard error of the mean is 2 gigabytes.
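As a quick check, the same arithmetic can be done in Python using only the sample figures given in the question:

```python
import math

sample_sd = 20  # sample standard deviation, in gigabytes
n = 100         # sample size

# Standard error of the mean = standard deviation / sqrt(sample size)
standard_error = sample_sd / math.sqrt(n)
print(standard_error)  # 2.0 gigabytes
```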