The Murphy family just drove 300 miles to visit their grandparents. Mr. Murphy drove 70 mph for 65% of the trip and 35 mph or less for 20% of the trip that was left. Assuming that Mr. Murphy never went over 70 mph, how many miles did he travel at a speed between 35 and 70 mph?

I can't remember the formula; we did this a couple of years ago.

distance at 70 mph = 0.65(300) = 195 miles

trip left = 300 - 195 = 105 miles, so distance at 35 mph or less = 0.20(105) = 21 miles

distance traveled between 35 and 70 mph = 300 - 195 - 21 = 84 miles

To calculate the number of miles Mr. Murphy traveled at a speed between 35 and 70 mph, we first determine the distance traveled at 70 mph and the distance traveled at 35 mph or less.

Step 1: Calculate the distance traveled at 70 mph.
Given that Mr. Murphy drove at 70 mph for 65% of the trip, we can calculate this distance using the formula:

Distance at 70 mph = 65% of 300 miles

Calculating this, we have:
Distance at 70 mph = 0.65 × 300 miles = 195 miles

Step 2: Calculate the distance traveled at 35 mph or less.
Given that Mr. Murphy drove at 35 mph or less for 20% of the remaining trip, we can calculate this distance using the formula:

Distance at 35 mph or less = 20% of (total trip - distance at 70 mph)

Calculating this, we have:
Distance at 35 mph or less = 0.20 × (300 miles - 195 miles) = 0.20 × 105 miles = 21 miles

Step 3: Calculate the distance traveled between 35 and 70 mph.
Since Mr. Murphy never went over 70 mph, the remaining distance must be traveled between 35 and 70 mph. Therefore, we need to subtract the distances calculated in Step 1 and Step 2 from the total distance.

Distance between 35 and 70 mph = Total distance - (Distance at 70 mph + Distance at 35 mph or less)

Calculating this, we have:
Distance between 35 and 70 mph = 300 miles - (195 miles + 21 miles) = 84 miles

Therefore, Mr. Murphy traveled 84 miles at a speed between 35 and 70 mph.
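For anyone who wants to double-check the arithmetic, the three steps above can be sketched in a few lines of Python (plain integer arithmetic, using the numbers from the problem):

```python
total = 300                   # total trip length in miles

fast = total * 65 // 100      # Step 1: 65% of the trip at 70 mph -> 195 miles
remaining = total - fast      # miles left after the fast portion -> 105 miles
slow = remaining * 20 // 100  # Step 2: 20% of the remainder at 35 mph or less -> 21 miles
middle = total - fast - slow  # Step 3: everything else is between 35 and 70 mph -> 84 miles

print(fast, slow, middle)     # 195 21 84
```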