The average selling price of homes in a certain city is $356,300. Assume the variable is normally distributed with a standard deviation of $64,600. If 396 homes are for sale, how many homes will sell for more than $325,000? (Round up to the next whole number.)
I don't understand how to do this problem. I have a test this weekend and want to understand it. The test won't have this exact question, but a similar one. Thanks.
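The standard approach for this kind of problem is: standardize the cutoff price to a z-score, use the standard normal CDF to get the probability a single home sells above that price, then multiply by the number of homes and round up. As a sketch, assuming the usual z-score method (the variable names here are just for illustration):

```python
import math
from statistics import NormalDist

# Quantities given in the problem
mean = 356_300       # average selling price
sd = 64_600          # standard deviation
n_homes = 396        # homes for sale
threshold = 325_000  # price cutoff of interest

# Step 1: standardize the cutoff: z = (x - mean) / sd
z = (threshold - mean) / sd  # roughly -0.48

# Step 2: P(X > 325,000) = P(Z > z) = 1 - Phi(z),
# where Phi is the standard normal CDF
p_above = 1 - NormalDist().cdf(z)  # roughly 0.686

# Step 3: expected number of homes, rounded UP per the problem
count = math.ceil(p_above * n_homes)
print(f"z = {z:.4f}, P(X > 325,000) = {p_above:.4f}, homes = {count}")
```

Note that because $325,000 is below the mean, the z-score is negative and the probability comes out above 0.5, so well over half of the 396 homes are expected to sell for more than that price. If your test only allows a printed z-table, you would round z to two decimals (about -0.48) and look up the area to the right, which gives essentially the same final count.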