A motorist left a city for a 295-mile, 7-hour trip. For the first part of the trip, the average speed was 40 mph. For the remainder of the trip, the average speed was 55 mph. How long did the motorist drive at 55 mph?

time of first leg --- x hrs

time of 2nd leg --- (7 - x) hrs

solve for x, then substitute into 7 - x:

40x + 55(7-x) = 295

To determine how long the motorist drove at 55 mph, solve the equation above for x, the time spent at 40 mph.

Distribute the 55 and collect like terms:

40x + 55(7 - x) = 295
40x + 385 - 55x = 295
-15x = -90
x = 6

So the first leg took 6 hours. Substituting into 7 - x gives the time spent at 55 mph:

7 - x = 7 - 6 = 1 hour

Check: 40 mph × 6 hr = 240 miles for the first leg, and 55 mph × 1 hr = 55 miles for the second, so 240 + 55 = 295 miles, matching the total distance. The motorist drove at 55 mph for 1 hour.
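If you want to double-check the arithmetic, here is a minimal Python sketch that solves the same equation directly; the variable names (total_distance, speed_first, and so on) are illustrative, not from the original problem.

```python
# Solve 40x + 55(7 - x) = 295 for x, the hours spent at 40 mph.
total_distance = 295   # miles
total_time = 7         # hours
speed_first = 40       # mph, first leg
speed_second = 55      # mph, second leg

# Rearranging the equation for x:
# x = (speed_second * total_time - total_distance) / (speed_second - speed_first)
x = (speed_second * total_time - total_distance) / (speed_second - speed_first)

print(f"Time at {speed_first} mph: {x} hours")                # 6.0
print(f"Time at {speed_second} mph: {total_time - x} hours")  # 1.0
```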