statistics

What is the standard deviation of a normal distribution, whose mean is 35, in which an x-value of 23 has a z-score of -1.63?

  • statistics -

    Use z-score formula and solve for standard deviation:

    z = (x - mean)/sd

    -1.63 = (23 - 35)/sd

Solve for sd: sd = (23 - 35)/(-1.63) = -12/(-1.63) ≈ 7.36.
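    The rearrangement above can be sketched in a few lines of Python (the function name `sd_from_z` is just an illustrative label, not from the original post):

    ```python
    # Rearrange the z-score formula z = (x - mean) / sd to solve for sd.
    def sd_from_z(x, mean, z):
        """Return the standard deviation implied by an x-value and its z-score."""
        return (x - mean) / z

    sd = sd_from_z(x=23, mean=35, z=-1.63)
    print(round(sd, 2))  # 7.36
    ```

    Note that the negative numerator (23 - 35 = -12) and the negative z-score cancel, so the standard deviation comes out positive, as it must.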

Similar Questions

  1. Statistics

     When we convert an x-value to a z-score and use the standard normal distribution, which of the following statements is true?
  2. MATH

     Which of the following normal distributions has the smallest spread?
  3. Math

     On a test whose distribution is approximately normal with a mean of 50 and a standard deviation of 10, the results for three students were reported as follows: Student Opie has a T-score of 60. Student Paul has a z-score of -1.00. …
  4. statistics

     A normal distribution has a mean of μ = 100 with a standard deviation of 20. If one score is randomly selected from this distribution, what is the probability that the score will have a value between X = 80 and X = 100?
  5. statistics

     An SRS of size n is taken from a large population whose distribution of income is extremely right-skewed and the mean income is calculated. Which of the following statements is false?
  6. statistics

     For a normal distribution with a mean of 140 and a standard deviation of 55, find each value requested. a. What is the minimum score needed to be in the bottom 10% of the distribution?
  7. statistics

     Why is it important to know the mean and standard deviation for a data set when applying the empirical rule?
  8. Statistics

     The distribution of IQ scores is a nonstandard normal distribution with a mean of 100 and a standard deviation of 15. What are the values of the mean and standard deviation after all IQ scores have been standardized by converting them …
  9. statistics

     A normal distribution has a mean of 20 and a standard deviation of 10. Two scores are sampled randomly from the distribution and the second score is subtracted from the first. What is the probability that the difference score will …
