Posted by Shelby.
A university has 1,000 computers available for students to use. Each computer
has a 250-gigabyte hard drive. The university wants to estimate the space
occupied on the hard drives. A random sample of 100 computers showed a mean
of 115 gigabytes used with a standard deviation of 20 gigabytes.
I'm not clear on how to calculate these answers and would appreciate any help. Thank you!
a) What is the standard error of the mean?
b) What is the probability that a sample mean is greater than 200 gigabytes?
c) What is the probability that a sample mean is greater than 120 gigabytes?
d) What is the probability that a sample mean is between 111 and 119
gigabytes?

Statistics 
Kuai,
A. The standard error of the mean = s/sqrt(n) = 20/sqrt(100) = 2
answer 2 gigabytes
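As a quick check, the standard error can be computed directly. A minimal Python sketch (the variable names are my own):

```python
import math

n = 100   # sample size
s = 20    # sample standard deviation, in gigabytes

# standard error of the mean: s / sqrt(n)
se = s / math.sqrt(n)
print(se)  # 2.0
```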
B. z = (200 - 115)/(20/sqrt(100)) = (200 - 115)/2
z = 42.5
answer: essentially 0 (a z-score of 42.5 is far beyond any table value, so a sample mean above 200 is practically impossible)
C. z = (120 - 115)/2 = 2.5
The question asks for the probability of a sample mean greater than 120, so take the upper tail:
P(Z > 2.5) = 1 - 0.9938 = 0.0062
answer 0.0062
D. z = (111 - 115)/2 = -2
z = (119 - 115)/2 = 2
P(-2 < Z < 2) = 0.9772 - 0.0228 = 0.9544
Answer 0.9544
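The probabilities in parts b through d can all be verified from the standard normal CDF. A sketch using only the standard library, writing the CDF via the error function (the helper name `phi` is my own):

```python
import math

def phi(z):
    # standard normal CDF, expressed through the error function:
    # Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mean, se = 115, 2.0  # sample mean and standard error from part a

# b) P(x-bar > 200): z = 42.5, so the upper tail is essentially 0
p_b = 1 - phi((200 - mean) / se)

# c) P(x-bar > 120): z = 2.5, upper tail
p_c = 1 - phi((120 - mean) / se)

# d) P(111 < x-bar < 119): z between -2 and 2
p_d = phi((119 - mean) / se) - phi((111 - mean) / se)

print(round(p_b, 4), round(p_c, 4), round(p_d, 4))
```

Rounding to four places reproduces the table-based answers above (0.9545 vs. 0.9544 for part d is just table rounding).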