
1. A random sample of 100 computers showed a mean of 115 gigabytes used with a standard deviation of 20 gigabytes. What is the standard error of the mean?

  • statistics -

    Standard error of the mean:

    standard deviation divided by the square root of the sample size.

    Here, SEM = s/√n = 20/√100 = 20/10 = 2 gigabytes.
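    A quick check of that arithmetic in Python (a minimal sketch; the variable names are illustrative, not from the question):

        import math

        s = 20    # sample standard deviation, in gigabytes
        n = 100   # sample size

        sem = s / math.sqrt(n)  # standard error of the mean
        print(sem)              # prints 2.0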
