P dollars invested at an interest rate r for t years grows to a total of T = P + Prt. An investment of $10,000 must grow to at least $12,000 in two years; what is the minimum interest rate at which it can be invested?

I'm not sure how to set up this problem. I thought at first I would substitute the given values into the equation, using 2 for the years t and $10,000 for the principal P, but I don't know if that's how I'm supposed to start.

You had the formula right, but in this case, you want an inequality, rather than an equation:

10000 + 10000 * r * 2 >= 12000

10000 * r * 2 >= 2000
r >= 0.1

So, you'd need at least 10% interest for the principal to grow from $10,000 to $12,000 in two years.
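
If you want to check that numerically, here's a minimal Python sketch (the function and variable names are just illustrative) that rearranges P + Prt >= T to solve for the smallest rate:

```python
def minimum_simple_interest_rate(principal, target, years):
    """Smallest rate r such that principal + principal * r * years >= target."""
    # Rearranging P + P*r*t >= T gives r >= (T - P) / (P * t).
    return (target - principal) / (principal * years)

rate = minimum_simple_interest_rate(10_000, 12_000, 2)
print(rate)  # 0.1, i.e. 10%
```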

To solve this problem, we can start by using the formula T = P + Prt, where T represents the total value after interest, P is the principal investment (starting amount), r is the interest rate (as a decimal), and t is the number of years.

In this case, we want to find the minimum interest rate at which a $10,000 investment will grow to $12,000 in two years. Let's set up the equation using the given values:

$12,000 = $10,000 + $10,000 * r * 2

Now, we can simplify this equation:

$12,000 = $10,000 + $20,000 * r

Next, we can isolate the interest rate (r) by subtracting $10,000 from both sides of the equation:

$2,000 = $20,000 * r

Now, divide both sides of the equation by $20,000:

r = $2,000 / $20,000

Simplifying further, we get:

r = 1/10

This means the minimum interest rate is 1/10, or 0.1 as a decimal, which is 10% in percentage form.

Therefore, the investment must earn at least 10% interest to grow from $10,000 to $12,000 in two years.
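
As a quick sanity check, you can plug the result back into the formula. This short snippet (purely illustrative) confirms that 10% simple interest over two years takes $10,000 to exactly $12,000:

```python
principal = 10_000
rate = 0.10   # 10% as a decimal
years = 2

total = principal + principal * rate * years  # T = P + P*r*t
print(total)  # 12000.0, so 10% is exactly the minimum rate needed
```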