Often we need to use statistical measures to analyze a problem. We run a program on 10 different inputs. The times are measured in 1-second intervals, and none of the runs took 0 seconds.

a. Suppose the standard deviation of the set of running times is 0. What does this tell you about the running times?

b. Suppose that the mean of the times is 1000.9 sec while the median is 1 sec. Explain what you know about the program's running times for all 10 different inputs.

c. Now assume that the mean of the times is 1000.9 sec, the median is 1 sec, and the variance is 9,998,000. Explain what you know about the program's running times for all 10 different inputs.

Can someone help me with a step-by-step solution to this problem? Thank you in advance.

First, note that 1000.9 seconds ÷ 60 ≈ 16.68 minutes, so the mean in parts b and c is a very long running time.

a. A standard deviation of 0 means there is no spread at all: every one of the 10 running times equals the mean, i.e., all runs took exactly the same amount of time.
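A quick sanity check with Python's statistics module (the identical times below are just a hypothetical example) illustrates that zero standard deviation means identical values:

```python
import statistics

# Hypothetical data: all 10 runs take the same time (say, 7 seconds each).
times = [7] * 10

print(statistics.mean(times))    # 7
print(statistics.pstdev(times))  # 0.0 -- no spread, so every value equals the mean
```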

b. The distribution is extremely right-skewed. Times are measured in whole seconds and none of them is 0, so every time is at least 1 sec; a median of 1 sec then forces at least half of the runs to have taken exactly 1 sec. Meanwhile, a mean of 1000.9 sec means the 10 times sum to 10,009 sec, so at least one run must have been an enormous outlier. A hypothetical data set consistent with these numbers is shown below.
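As a sketch (this is only one possible data set consistent with part b, assumed here for illustration), nine 1-second runs plus a single 10,000-second run reproduce the stated mean and median:

```python
import statistics

# Hypothetical times: nine fast runs and one huge outlier.
times = [1] * 9 + [10000]

print(statistics.mean(times))    # 1000.9
print(statistics.median(times))  # 1.0
```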

c. SD = √variance ≈ √9,998,000 ≈ 3,162 sec, which is more than three times the mean. For a distribution this skewed (far from normal), the SD is not a good summary of typical variability; it is dominated by the outlier(s). In fact, a sample variance of about 9,998,000 together with a mean of 1000.9 sec and a median of 1 sec is consistent with nine runs of 1 sec and one run of 10,000 sec, as checked below.
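Continuing with the same hypothetical data (and assuming the stated variance is the sample variance, i.e., the n − 1 formula), the numbers can be checked directly:

```python
import statistics

# Same hypothetical times as in part b: nine 1-second runs and one 10,000-second run.
times = [1] * 9 + [10000]

print(statistics.variance(times))  # ≈ 9998000.1 -- matches the stated variance of ~9,998,000
print(statistics.stdev(times))     # ≈ 3162 sec, far larger than the mean of 1000.9 sec
```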