So if a scale has 30 items and an individual gets a score of 20, what does this mean? The only way to answer this is to look at the standardization sample and find its mean and standard deviation. Let's say the sample mean was 15 and the standard deviation was 5; then a score of 20 is one standard deviation above the mean. If we take the mean raw score and set it equal to 50, and take the standard deviation and set it equal to 10, then what number does a score of 20 on the scale correspond to?

Assuming that each item on the scale has a value of one, how could the mean be 50?

Are you asking about just a number (a raw score; 20 seems to be the raw score) or a z-score? If the latter,

Z = (score-mean)/SD

thanks!

You're welcome.

To determine the answer, we convert the score to a z-score using the standardization sample's mean and standard deviation, and then rescale it to the new metric. Let's go step by step:

1. Calculate the z-score (standardized score) for the individual's score of 20 in relation to the standardization sample. The formula for calculating the z-score is:

z = (X - μ) / σ

Where:
- X is the individual's score (20)
- μ is the mean score of the standardization sample (15)
- σ is the standard deviation of the standardization sample (5)

Plugging in the values:
z = (20 - 15) / 5
z = 1

So, a score of 20 corresponds to a z-score of 1, indicating that it is one standard deviation above the mean.

2. Now, let's convert this z-score to the desired scale, where the mean is 50 and the standard deviation is 10.

Using the formula for converting z-scores to raw scores:

X = (z * σ) + μ

Where:
- X is the raw score we want to find
- μ is the mean of the desired scale (50)
- σ is the standard deviation of the desired scale (10)

Plugging in the values:
X = (1 * 10) + 50
X = 10 + 50
X = 60

Therefore, a score of 20 on the original scale would be equivalent to a score of 60 on the scale with a mean of 50 and a standard deviation of 10.
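
If you want to check the arithmetic, here is a minimal Python sketch of the two steps above. The function name `rescale_score` and its parameter names are illustrative choices of my own, not from any particular statistics package; the default target metric (mean 50, SD 10) matches the one asked about in the question, and a score on that metric is commonly called a T-score.

```python
def rescale_score(raw, sample_mean, sample_sd, new_mean=50.0, new_sd=10.0):
    """Convert a raw score to a new metric (by default mean 50, SD 10)."""
    z = (raw - sample_mean) / sample_sd   # step 1: z-score relative to the standardization sample
    return new_mean + z * new_sd          # step 2: rescale to the desired mean and SD

# Example from the thread: raw score 20, sample mean 15, sample SD 5
z = (20 - 15) / 5
print(z)                          # 1.0  (one SD above the mean)
print(rescale_score(20, 15, 5))   # 60.0 (on the mean-50, SD-10 scale)
```

Note that this is just a linear transformation, so it preserves the person's relative standing: one standard deviation above the mean stays one standard deviation above the mean, whatever numbers the new scale uses.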