
Homework Help: Algebra 2

Posted by Anonymous on Saturday, October 1, 2011 at 12:46pm.

P dollars invested at an interest rate r for t years grows to the amount A = P + Prt. An investment of $10,000 grew to at least $12,000 in two years. What is the minimum interest rate at which it could have been invested?
I'm not sure how to set up this problem. I thought at first I would substitute the years and the dollars into the equation, using 2 for t and $10,000 for P, but I don't know if that's how I'm supposed to start.
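
A minimal sketch of the setup, assuming the intended formula is the simple-interest amount A = P + Prt (variable names as in the restated problem above): substitute P = 10000 and t = 2, require the amount to be at least 12000, and solve the inequality for r.

% Simple-interest inequality, assuming A = P + Prt with P = 10000, t = 2
\begin{align*}
P + Prt &\ge 12000 \\
10000 + 10000 \cdot r \cdot 2 &\ge 12000 \\
20000\,r &\ge 2000 \\
r &\ge 0.10
\end{align*}

Under those assumptions, the minimum rate is r = 0.10, that is, 10% per year.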
