Posted by **Olivia** on Wednesday, November 7, 2012 at 11:39am.

Suppose a randomly selected sample of 75 American teenagers watches an average of 1,472 hours of television per year, with a standard deviation of 250 hours. Construct the 95% confidence interval (CI) for the mean number of hours American teenagers watch television per year.
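A minimal sketch of the standard large-sample z-interval, assuming n = 75 is large enough to use the normal critical value z = 1.96 and to treat the sample standard deviation s = 250 as an estimate of the population standard deviation:

```python
import math

# Sample statistics from the question (assumption: a z-interval is
# appropriate because n = 75 is large; s estimates the population sigma)
n = 75
mean = 1472          # sample mean, hours per year
s = 250              # sample standard deviation, hours
z = 1.96             # critical value for 95% confidence

# Standard error of the mean: s / sqrt(n)
se = s / math.sqrt(n)          # ≈ 28.87 hours

# Margin of error and interval endpoints: mean ± z * se
margin = z * se                # ≈ 56.58 hours
lower, upper = mean - margin, mean + margin

print(f"95% CI: ({lower:.1f}, {upper:.1f})")  # 95% CI: (1415.4, 1528.6)
```

So the interval is roughly 1,415 to 1,529 hours per year. Using the t-distribution with 74 degrees of freedom instead of z = 1.96 would widen the interval only slightly (the critical value is about 1.993).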
