P dollars invested at an interest rate r for t years grows to the amount A = P + Prt. An investment of $10,000 grew to at least $12,000 in two years. What is the minimum interest rate at which it could have been invested?
I'm not sure how to set up this problem. At first I thought I would substitute the given values into the equation, 2 for the years t and 10,000 for the dollars P, but I do not know if that is how I am supposed to start.

  • Algebra 2 -

    You had the formula right, but in this case, you want an inequality, rather than an equation:

    10000 + 10000 * r * 2 >= 12000

    10000 * r * 2 >= 2000
    r >= 0.1

    So you'd need an interest rate of at least 10% for the principal to grow from $10,000 to $12,000 in two years.
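
    A quick way to double-check this (not part of the original reply) is a short Python sketch that solves the same inequality for r; the variable names here are my own:

        P = 10_000   # principal in dollars
        A = 12_000   # target amount after t years
        t = 2        # time in years

        # P + P*r*t >= A  =>  r >= (A - P) / (P * t)
        r_min = (A - P) / (P * t)
        print(f"minimum simple-interest rate: {r_min:.2%}")   # prints 10.00%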
