Please help me with this math problem.
A woman is given a loan of $20,000 for 1 year. If the interest charged is $800, what was the interest rate on the loan?
To get the answer I set the problem up like this:
R = 800/20,000 = 25(1/100) = 25/100 = 25%
but I know this can't be right. When I go back and multiply the rate by the base ($20,000), I get $5,000. Please tell me what I have done wrong.
I got 4%. I divided 800 by 20,000 and got 0.04, which equals 4%... but I could be very wrong.
Angela -- you set up the problem correctly but somehow didn't get the right answer. You probably divided 20,000 by 800 instead of 800 by 20,000.
Anonymous is right. 800/20,000 = 0.04 = 4%.
To find the interest rate on a loan, you need to use the formula:
Interest = Principal * Rate * Time
In this case, the principal (P) is $20,000, the interest (I) is $800, and the time (T) is 1 year. Since you want to find the rate (R), you can rearrange the formula:
Rate = Interest / (Principal * Time)
Plugging in the known values:
Rate = 800 / (20,000 * 1)
Rate = 800 / 20,000
Rate = 0.04
The interest rate on the loan is 0.04 or 4%.
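As a quick check, the rearranged formula can be computed directly. Here is a minimal Python sketch (the function name `simple_interest_rate` is just illustrative):

```python
def simple_interest_rate(interest, principal, time=1.0):
    """Return the simple interest rate as a decimal: R = I / (P * T)."""
    return interest / (principal * time)

# Angela's loan: $800 interest on a $20,000 principal over 1 year
rate = simple_interest_rate(800, 20_000, 1)
print(f"{rate:.2%}")  # prints "4.00%"
```

Multiplying back confirms the answer: 20,000 * 0.04 * 1 = $800, which matches the interest charged.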
Your setup was actually correct: dividing the interest ($800) by the principal ($20,000) is the right approach, and since the time is 1 year, it does not change the result. The error was in the arithmetic: 800/20,000 = 0.04, not 25. Dividing the other way, 20,000/800 = 25, which is likely where the incorrect 25% came from.
To find the interest rate correctly, include all the variables in the formula and double-check the division by multiplying the rate back against the principal.