Posted by **badar** on Monday, December 26, 2011 at 10:25am.

A sample of 400 high school students showed that they spend an average of 70 minutes a day watching television with a standard deviation of 14 minutes. Another sample of 500 college students showed that they spend an average of 55 minutes a day watching television with a standard deviation of 12 minutes.

a. Construct a 99% confidence interval for the difference between the mean times spent watching television by all high school students and all college students.

b. Test at the 2.5% significance level if the mean time spent watching television per day by high school students is higher than the mean time spent watching television by college students.

- statistics -
**PsyDAG**, Monday, December 26, 2011 at 2:06pm
Z = (mean1 - mean2)/standard error (SE) of the difference between means

SEdiff = √(SEm1^2 + SEm2^2)

SEm = SD/√n

If only one SD is provided, you can use just that to determine SEdiff.

Find the table in the back of your statistics text labeled something like "areas under the normal distribution" to find the proportion related to the Z score.
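Plugging the sample numbers into those formulas, here is a minimal Python sketch of both parts. The critical values are assumptions taken from a standard normal table: 2.576 for a two-sided 99% interval, and 1.960 for an upper-tail test at the 2.5% level.

```python
import math

# Sample statistics from the problem
n1, mean1, sd1 = 400, 70, 14   # high school students
n2, mean2, sd2 = 500, 55, 12   # college students

# SEm = SD/sqrt(n) for each group, then SEdiff = sqrt(SEm1^2 + SEm2^2)
se1 = sd1 / math.sqrt(n1)
se2 = sd2 / math.sqrt(n2)
se_diff = math.sqrt(se1**2 + se2**2)

# (a) 99% confidence interval: (mean1 - mean2) +/- z* x SEdiff
diff = mean1 - mean2
z_crit_99 = 2.576              # two-sided 99% critical value (from a Z table)
ci = (diff - z_crit_99 * se_diff, diff + z_crit_99 * se_diff)

# (b) one-tailed test at the 2.5% level: reject H0 if Z > 1.960
z = diff / se_diff
z_crit_025 = 1.960             # upper-tail 2.5% critical value (from a Z table)
reject = z > z_crit_025

print(f"SE of difference: {se_diff:.3f}")      # about 0.882
print(f"99% CI: ({ci[0]:.2f}, {ci[1]:.2f})")   # about (12.73, 17.27)
print(f"z = {z:.2f}, reject H0: {reject}")     # z about 17, so reject
```

Since the computed Z is far beyond the critical value, the data support the claim that high school students watch more television per day than college students.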
