posted by James on .
"For 2000 patients, blood-clotting time was normally distributed with a mean of 8 seconds and a standard deviation of 3 seconds. What percent had blood-clotting times between 5 and 11 seconds?"
You usually need an error function table or a computer tool for this kind of problem. I'm a bit surprised they teach this in Algebra 2.
I like to use the Java tool at
The answer in this case is 68.3%. That will always be the answer when you are looking at the region from one standard deviation below the mean to one standard deviation above the mean, as is the case here: 5 = 8 − 3 and 11 = 8 + 3, so the interval is exactly mean ± one standard deviation.
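If you'd rather compute it than look it up, here is a small sketch using Python's standard-library error function to evaluate the normal CDF at each endpoint (the function name `normal_cdf` is just my own helper, not from any library):

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # P(X <= x) for a normal distribution, via the error function:
    # Phi(z) = (1 + erf(z / sqrt(2))) / 2, with z = (x - mu) / sigma
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Mean 8 s, standard deviation 3 s; interval 5 to 11 s
p = normal_cdf(11, 8, 3) - normal_cdf(5, 8, 3)
print(round(100 * p, 1))  # -> 68.3
```

Multiplying by the group size, about 0.683 × 2000 ≈ 1365 of the patients fall in that range.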