Ms. Peters drove from her home to the park at an average speed of 30 miles per hour and returned home along the same route at an average speed of 40 miles per hour. If her driving time from home to the park was 20 minutes, how many minutes did it take Ms. Peters to drive from the park to her home?

In 20 minutes (1/3 hr) at 30 mi/hr, she covers 10 miles.

10 mi / (40 mi/hr) = 1/4 hr = 15 minutes

Or, you can reason that since her return speed is 4/3 as great, the time required is 3/4 as much: (3/4)(20 minutes) = 15 minutes.
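As a quick numerical check, here is a minimal Python sketch of that ratio argument (the variable names are my own, not part of the original answer):

```python
# Over a fixed distance, time scales inversely with speed.
speed_out, speed_back = 30, 40   # mi/hr (given)
time_out_min = 20                # minutes (given)

# time_back / time_out = speed_out / speed_back
time_back_min = time_out_min * speed_out / speed_back
print(time_back_min)  # 15.0 minutes
```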

It was wrong!

To find the driving time from the park to Ms. Peters' home, we can apply the formula:

Time = Distance / Speed

First, let's convert the driving time from home to the park from minutes to hours. Since there are 60 minutes in an hour, 20 minutes is equivalent to 20/60 = 1/3 hour.

Now, let's calculate the distance from home to the park using the formula:

Distance = Speed * Time

Distance = 30 miles/hour * 1/3 hour = 10 miles

Now, we can use the distance and the average speed for the return journey to find the driving time from the park to Ms. Peters' home.

Time = Distance / Speed

Time = 10 miles / 40 miles/hour

Time = 1/4 hour

Finally, let's convert the driving time from the park to her home back to minutes. Since there are 60 minutes in an hour, 1/4 hour is equivalent to (1/4)*60 = 15 minutes.

Therefore, it took Ms. Peters 15 minutes to drive from the park to her home.
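For anyone who wants to verify the arithmetic, here is a short Python sketch of the same steps (a verification only; the variable names are hypothetical):

```python
# Step-by-step verification of the solution above.
speed_to_park = 30          # mi/hr
speed_to_home = 40          # mi/hr
time_to_park_hr = 20 / 60   # 20 minutes converted to hours

# Distance = Speed * Time
distance_mi = speed_to_park * time_to_park_hr   # 10.0 miles

# Time = Distance / Speed, then convert hours back to minutes
time_to_home_min = (distance_mi / speed_to_home) * 60
print(time_to_home_min)  # 15.0 minutes
```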