Mary drives 20 miles per hour and it takes 50 minutes to reach her destination. How much time would it take her to get there if she was driving 40 miles per hour?

My answer: She doubled her miles per hour, so I subtracted 20 from 50 minutes and came up with 30 minutes as the answer. Is that correct?

d = r * t

Twice the speed means half the time.

She doubled her speed, so it took half the time.

Half of 50 minutes is 25 minutes.

No, your calculation is incorrect. Let's break it down and calculate the correct answer.

To find the time it would take for Mary to reach her destination when driving at 40 miles per hour, we need to consider the relationship between speed, distance, and time.

We know that time = distance / speed.

In the given situation:
Speed = 20 miles per hour
Time = 50 minutes

To determine the distance Mary traveled, we can rearrange the formula as distance = speed * time.

Distance = 20 miles per hour * (50 minutes / 60 minutes per hour)
Distance = 20 miles per hour * 0.8333 hours
Distance = 16.67 miles
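
As a quick check, here is a minimal Python sketch of this first step (the variable names are just illustrative, not from the original problem):

```python
# Convert the travel time to hours, then apply distance = speed * time.
speed_mph = 20       # original speed in miles per hour
time_minutes = 50    # original travel time in minutes

time_hours = time_minutes / 60            # 0.8333... hours
distance_miles = speed_mph * time_hours   # about 16.67 miles
print(round(distance_miles, 2))           # 16.67
```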

Now let's find the time it would take if Mary drove at 40 miles per hour.

Speed = 40 miles per hour
Distance = 16.67 miles

Using the formula time = distance / speed:
Time = 16.67 miles / 40 miles per hour
Time = 0.4167 hours

To convert this time to minutes, multiply by 60:
Time = 0.4167 hours * 60 minutes per hour
Time = 25 minutes
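
Continuing the sketch in Python, the second step (time = distance / speed, then hours converted to minutes) looks like this; again, the variable names are assumptions made for illustration:

```python
# Distance from the first leg: 20 mph for 50 minutes.
distance_miles = 20 * (50 / 60)   # about 16.67 miles
new_speed_mph = 40                # new speed in miles per hour

new_time_hours = distance_miles / new_speed_mph   # about 0.4167 hours
new_time_minutes = new_time_hours * 60            # 25.0 minutes
print(new_time_minutes)                           # 25.0
```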

Therefore, if Mary were driving at 40 miles per hour, it would take her 25 minutes to reach her destination. Doubling the speed exactly halves the time, so even though the intermediate values above were rounded, the answer is exactly 25 minutes, not 30.