Posted by badar on Monday, December 26, 2011 at 10:25am.
A sample of 400 high school students showed that they spend an average of 70 minutes a day watching television with a standard deviation of 14 minutes. Another sample of 500 college students showed that they spend an average of 55 minutes a day watching television with a standard deviation of 12 minutes.
a. Construct a 99% confidence interval for the difference between the mean times spent watching television by all high school students and all college students.
b. Test at the 2.5% significance level if the mean time spent watching television per day by high school students is higher than the mean time spent watching television by college students.

statistics - PsyDAG, Monday, December 26, 2011 at 2:06pm
Z = (mean1 - mean2)/standard error (SE) of the difference between means
SEdiff = √(SEmean1^2 + SEmean2^2)
SEm = SD/√n
If only one SD were provided, you could use just that to estimate SEdiff; here both samples report their own SD, so use both.
Find the table in the back of your statistics text labeled something like "areas under the normal distribution" to find the proportion (p-value) associated with your Z score.
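Plugging the numbers from the problem into the formulas above, here is a quick numeric check (a sketch using only Python's standard library; the variable names are mine, not from the original reply):

```python
import math
from statistics import NormalDist

# Sample statistics from the problem
n1, mean1, sd1 = 400, 70, 14   # high school students
n2, mean2, sd2 = 500, 55, 12   # college students

# Standard error of each sample mean: SEm = SD / sqrt(n)
se1 = sd1 / math.sqrt(n1)      # 14/20 = 0.70
se2 = sd2 / math.sqrt(n2)      # 12/sqrt(500) ~ 0.537

# Standard error of the difference between means
se_diff = math.sqrt(se1**2 + se2**2)   # ~ 0.882

# (a) 99% confidence interval for mean1 - mean2
diff = mean1 - mean2                    # 15 minutes
z_crit = NormalDist().inv_cdf(0.995)    # ~ 2.576 (two-tailed 99%)
ci = (diff - z_crit * se_diff, diff + z_crit * se_diff)

# (b) One-tailed test at the 2.5% level: H0: mu1 <= mu2 vs H1: mu1 > mu2
z = diff / se_diff                      # ~ 17.0
p_value = 1 - NormalDist().cdf(z)       # far below 0.025, so reject H0

print(ci, z, p_value)
```

With samples this large the normal (Z) approximation is appropriate; the interval comes out to roughly (12.7, 17.3) minutes, and the huge Z statistic means the table lookup will show essentially zero area in the upper tail.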