A car rental company has two rates. The first rate is $45 per day plus 29.5 cents per mile; the second rate is $90 per day plus 7 cents per mile. If you rented at the $90 rate, how many miles would you need to drive to break even?

In pennies, for a one-day rental:

4500 + 29.5 m = 9000 + 7 m

22.5 m = 4500

m = 200 miles
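A quick numeric check of this answer, as a minimal Python sketch (the cost_rate1 and cost_rate2 helper names are made up for illustration, not part of the problem):

# Total cost in pennies, assuming a one-day rental.
def cost_rate1(miles):
    return 4500 + 29.5 * miles   # $45/day plus 29.5 cents per mile

def cost_rate2(miles):
    return 9000 + 7 * miles      # $90/day plus 7 cents per mile

print(cost_rate1(200), cost_rate2(200))  # both equal 10400 pennies, i.e. $104.00

At 200 miles both rates cost $104.00, so 200 miles is indeed the break-even point.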

To determine how many miles you would need to drive to break even between the two rates, we need to set up an equation for each rate.

Let's call the number of miles driven "x" and the total cost of a one-day rental "y."

For the first rate, the equation would be: y = 45 + 0.295x. (The $45 daily charge is fixed for the day; only the $0.295 per mile is multiplied by the number of miles driven.)

For the second rate, the equation would be: y = 90 + 0.07x. (Again, the $90 daily charge is fixed; the $0.07 per mile is multiplied by the number of miles driven.)

To find the break-even point, we equate the two costs and solve for "x" (miles driven):

45 + 0.295x = 90 + 0.07x.

Subtracting 0.07x from both sides, we get:

45 + 0.225x = 90.

Subtracting 45 from both sides:

0.225x = 45.

Dividing both sides by 0.225, we obtain:

x = 200.

This means that to break even between the two rates, you would need to drive 200 miles, which matches the answer from the calculation in pennies above.
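The same algebra generalizes: the break-even mileage is the difference in daily charges divided by the difference in per-mile charges. A minimal sketch under that assumption, using a hypothetical break_even_miles helper and equal rental days for both rates:

# Solve days*daily1 + per_mile1*m = days*daily2 + per_mile2*m for m.
def break_even_miles(daily1, per_mile1, daily2, per_mile2, days=1):
    return days * (daily2 - daily1) / (per_mile1 - per_mile2)

# Working in pennies, as in the check above: 4500 / 22.5 = 200.
print(break_even_miles(4500, 29.5, 9000, 7))  # 200.0 miles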