Suppose that the average score on the GMAT exam is 500 and that the standard deviation of all scores is 100 points. You would expect approximately 95% of all GMAT scores to be between

95% lie within 1.96 SD of the mean.

OK... I'm still looking for an answer. Between 330 and 630, etc.?

Let's see. Can you multiply 1.96 by 100 and add/subtract that from 500?
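Carrying out that arithmetic (a minimal sketch in Python, using the mean of 500 and standard deviation of 100 given in the question and the 1.96 multiplier from the reply above):

```python
mean, sd = 500, 100   # GMAT mean and standard deviation from the question
z = 1.96              # multiplier that captures the middle 95% of a normal distribution

lower = mean - z * sd  # 500 - 196 = 304
upper = mean + z * sd  # 500 + 196 = 696
print(lower, upper)    # 304.0 696.0
```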

To find the range within which approximately 95% of all GMAT scores fall, we can use the empirical rule (also known as the 68-95-99.7 rule), which applies to data that follow a normal distribution. The rule states the following (a quick numerical check appears after the list):

- Approximately 68% of the data falls within one standard deviation of the mean.
- Approximately 95% of the data falls within two standard deviations of the mean.
- Approximately 99.7% of the data falls within three standard deviations of the mean.
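As a check of those three percentages (a sketch that assumes SciPy is available; scipy.stats.norm.cdf is the cumulative distribution function of the standard normal):

```python
from scipy.stats import norm

# Probability mass of a normal distribution within k standard
# deviations of its mean, computed from the standard normal CDF.
for k in (1, 2, 3):
    coverage = norm.cdf(k) - norm.cdf(-k)
    print(f"within {k} SD: {coverage:.1%}")

# within 1 SD: 68.3%
# within 2 SD: 95.4%
# within 3 SD: 99.7%
```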

Given that the average score on the GMAT exam is 500 and the standard deviation is 100 points, we can apply the empirical rule to estimate the range:

1. Find the lower and upper limits within which approximately 95% of the data falls:
   - The lower limit is the mean minus two standard deviations: 500 - (2 * 100) = 500 - 200 = 300.
   - The upper limit is the mean plus two standard deviations: 500 + (2 * 100) = 500 + 200 = 700.

2. Therefore, approximately 95% of GMAT scores fall within the range of 300 to 700.
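To verify that conclusion numerically (a sketch that again assumes SciPy; it measures how much of a normal distribution with mean 500 and standard deviation 100 lies between the two limits):

```python
from scipy.stats import norm

mean, sd = 500, 100
lower = mean - 2 * sd   # 300
upper = mean + 2 * sd   # 700

# Fraction of a normal(500, 100) distribution between the two limits.
coverage = norm.cdf(upper, loc=mean, scale=sd) - norm.cdf(lower, loc=mean, scale=sd)
print(f"{coverage:.1%} of scores fall between {lower} and {upper}")
# 95.4% of scores fall between 300 and 700
```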

Please note that this is an approximation based on the empirical rule and assumes that the GMAT scores follow a normal distribution. In reality, the distribution of GMAT scores may deviate from a perfect normal distribution.