You jog at 6.2 mi/h for 5.0 mi, then you jump into a car and drive for another 5.0 mi. With what average speed must you drive if your average speed for the entire 10.0 miles is to be 10.6 mi/h?

I don't get why the answer isn't 15.

The average speed for the entire trip is the total distance divided by the total time.

Therefore,
t1 = 5/6.2 = 0.806 hr
t2 = 5/V
Total distance traveled is 5 + 5 = 10 miles.
Total time is t1 + t2 = 0.806 + 5/V.
The average speed for the whole trip is therefore
Vavg = 10/(0.806 + 5/V) = 10.6

Solving, V = 36.511 mi/h.

Check: (5 + 5)/(0.806 + 5/36.511) = 10.6 mi/h.
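A quick numeric check confirms this; here is a minimal Python sketch (the variable names are mine, not from the problem):

```python
jog_speed = 6.2             # mi/h
leg = 5.0                   # miles per leg (jog and drive are equal)
target_avg = 10.6           # mi/h, required average over all 10 miles

t1 = leg / jog_speed                # jogging time: ~0.8065 h
total_time = 2 * leg / target_avg   # total time budget: ~0.9434 h
t2 = total_time - t1                # time left for driving: ~0.1369 h
v = leg / t2                        # required driving speed

print(round(v, 3))  # 36.511
```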

To find the required driving speed, you need to consider the time taken to jog and the time left over for driving.

Let's first calculate the time taken to jog:
Distance = 5.0 miles
Speed = 6.2 miles per hour
Using the formula Time = Distance / Speed, we can find the time:
Time taken to jog = Distance / Speed = 5.0 miles / 6.2 miles per hour ≈ 0.8065 hours.
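As a one-line Python check of this step (variable name is mine):

```python
t_jog = 5.0 / 6.2
print(round(t_jog, 4))  # 0.8065
```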

Now, the remaining distance to cover is 5.0 miles, and the required average speed for the entire 10.0 miles is 10.6 miles per hour. Let's denote the unknown time taken to drive this remaining distance as T.

We can express the average speed as the total distance divided by the total time:
Average speed = Total distance / Total time
10.6 miles per hour = 10.0 miles / (0.8065 hours + T hours)

Now, we can solve for T by rearranging the equation:
T = (10.0 miles / 10.6 miles per hour) - 0.8065 hours
T = 0.9434 hours - 0.8065 hours ≈ 0.1369 hours
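Keeping full precision for the jog time, a short Python check of this rearrangement (names mine):

```python
t_jog = 5.0 / 6.2           # keep full precision, ~0.8065 h
T = 10.0 / 10.6 - t_jog     # total time budget minus jogging time
print(round(T, 4))          # 0.1369
```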

Since T represents the time taken to drive the remaining 5.0 miles, we can find the required driving speed:
Average speed for driving = Distance / Time = 5.0 miles / 0.1369 hours ≈ 36.5 miles per hour.
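The corresponding Python check, which keeps full precision throughout and reproduces the 36.511 mi/h found earlier:

```python
T = 10.0 / 10.6 - 5.0 / 6.2   # driving time, ~0.1369 h
print(round(5.0 / T, 3))      # 36.511
```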

Therefore, to achieve an average speed of 10.6 miles per hour for the entire 10.0 miles, you must drive at an average speed of approximately 36.5 miles per hour, which matches the 36.511 mi/h found above. The answer isn't 15 because (6.2 + 15)/2 = 10.6 would be the right average only if equal times were spent at each speed; here the two legs cover equal distances, so the slow jogging leg takes most of the total time and pulls the average down.
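To see directly why 15 mi/h falls short, here is the overall average you would actually get by driving at 15 mi/h (a quick Python sketch):

```python
t_jog = 5.0 / 6.2        # ~0.8065 h spent jogging
t_drive = 5.0 / 15.0     # ~0.3333 h spent driving at 15 mi/h
overall = 10.0 / (t_jog + t_drive)
print(round(overall, 2))  # 8.77, well below the 10.6 target
```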