Boston is 200 mi away from NY. It takes me 1 hour less to drive to Boston when I drive 10 mph faster than my normal rate. What is my normal rate?


To solve this problem, let's break it down step by step:

Step 1: Let's assume your normal rate is "x" miles per hour.
Step 2: If you drive 10 miles per hour faster than your normal rate, then your speed would be "x + 10" miles per hour.
Step 3: The distance from NY to Boston is 200 miles, and it takes you 1 hour less to drive to Boston at the faster speed.
Step 4: So, if you drive at your normal rate, it would take you "200 / x" hours to reach Boston.
Step 5: And if you drive 10 miles per hour faster, it would take you "200 / (x + 10)" hours to reach Boston.
Step 6: According to the problem, the difference in time is 1 hour. So, we can set up the following equation:

200 / x - 200 / (x + 10) = 1

Step 7: Now, let's solve the equation. First, combine the left side over the common denominator x(x + 10):

[200(x + 10) - 200x] / (x(x + 10)) = 1

Step 8: Multiplying both sides by x(x + 10) and expanding the numerator, we get:

200x + 2000 - 200x = x(x + 10)

Step 9: The 200x terms on the left cancel, leaving:

2000 = x(x + 10)

Step 10: Distribute the right side and rearrange into standard form to get a quadratic equation:

x^2 + 10x - 2000 = 0

Step 11: Solve the quadratic equation by factoring, completing the square, or using the quadratic formula. In this case, the equation factors easily:

(x - 40)(x + 50) = 0

Step 12: Setting each factor equal to zero, we have:

x - 40 = 0 or x + 50 = 0

Step 13: Solving for x, we get:

x = 40 or x = -50
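
(If the factoring in Step 11 isn't obvious, the quadratic formula gives the same two values:

x = [-10 ± sqrt(10^2 - 4(1)(-2000))] / 2 = (-10 ± 90) / 2

which works out to 40 or -50.)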

Step 14: Since speed cannot be negative in this context, we discard the x = -50 solution.

Therefore, your normal rate is 40 miles per hour.
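
If you'd like to double-check the algebra with software, here is a minimal Python sketch using SymPy (this assumes SymPy is installed; the symbol name x is just for illustration):

from sympy import symbols, Eq, solve

x = symbols('x', positive=True)                 # speed must be positive, so only x = 40 is returned
print(solve(Eq(200/x - 200/(x + 10), 1), x))    # prints [40]

# Sanity check: 200/40 = 5 hours at the normal rate, 200/50 = 4 hours at the faster rate
print(200/40 - 200/50)                          # prints 1.0, the 1-hour difference from the problem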