Posted by jenny.
Given the following information about a hypothesis test of the difference between two means based on independent random samples, what is the standard deviation of the difference between the two means? Assume that the samples are obtained from normally distributed populations having equal variances.
H0: μA ≤ μB, and H1: μA > μB
x̄1 = 12, x̄2 = 9, s1 = 5, s2 = 3, n1 = 13, n2 = 10.
Since the populations are assumed to have equal variances, first pool the two sample variances, then compute the standard deviation (standard error) of the difference between the two means:

sp^2 = [(n1 − 1)s1^2 + (n2 − 1)s2^2] / (n1 + n2 − 2)

Standard error = sp · √[(1/n1) + (1/n2)]

(If the equal-variance assumption were dropped, you would instead use √[(s1^2/n1) + (s2^2/n2)].)
I'll let you take it from here.
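If you want to check your arithmetic, here is a short Python sketch of the pooled-variance calculation using the numbers from the problem (variable names are my own, not from the original post):

```python
import math

# Given sample statistics from the problem
s1, s2 = 5.0, 3.0   # sample standard deviations
n1, n2 = 13, 10     # sample sizes

# Pooled variance (equal population variances assumed)
sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)

# Standard error of the difference between the two means
se = math.sqrt(sp2) * math.sqrt(1 / n1 + 1 / n2)

print(round(se, 4))  # about 1.7916
```

Plugging in by hand: sp^2 = (12·25 + 9·9)/21 = 381/21 ≈ 18.143, so sp ≈ 4.259 and the standard error ≈ 4.259 · √(1/13 + 1/10) ≈ 1.79.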