posted by Shelby.
A university has 1,000 computers available for students to use. Each computer
has a 250-gigabyte hard drive. The university wants to estimate the space
occupied on the hard drives. A random sample of 100 computers showed a mean
of 115 gigabytes used with a standard deviation of 20 gigabytes.
I'm not clear on how to calculate these answers and would greatly appreciate any help. Thank you!
a) What is the standard error of the mean?
b) What is the probability that a sample mean is greater than 200 gigabytes?
c) What is the probability that a sample mean is greater than 120 gigabytes?
d) What is the probability that a sample mean is between 111 and 119 gigabytes?
A. The standard error of the mean = 20/sqrt(100) = 2
B. z = (200 - 115)/(20/sqrt(100)) = 85/2
z = 42.5
A z-score of 42.5 is far beyond any table value, so the probability is essentially 0.
C. z = (120 - 115)/2 = 2.5
The table gives P(z < 2.5) = 0.9938, but the question asks for the probability the mean is *greater* than 120, so the answer is 1 - 0.9938 = 0.0062.
D. z = (111-115) /2
z = -2
z = (119-115)/2
z = 2
.9772-.0228 = 0.9544
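As a check on the arithmetic above, here is a short Python sketch (an illustration, not part of the original problem) that computes parts a–d using only the standard library's `math.erf` for the normal CDF:

```python
from math import sqrt, erf

def norm_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

n, mean, sd = 100, 115, 20
se = sd / sqrt(n)                       # a) standard error = 20/sqrt(100) = 2.0

p_b = 1 - norm_cdf((200 - mean) / se)   # b) P(sample mean > 200), essentially 0
p_c = 1 - norm_cdf((120 - mean) / se)   # c) P(sample mean > 120) = 1 - 0.9938
p_d = norm_cdf((119 - mean) / se) - norm_cdf((111 - mean) / se)  # d) P(-2 < z < 2)

print(se)              # 2.0
print(round(p_c, 4))   # 0.0062
print(round(p_d, 4))   # 0.9545
```

Note that part d comes out as 0.9545 here; the table-based answer 0.9544 differs only because table entries are rounded to four decimal places.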