Traveling to Washington is a 500-mile trip that took 5 hours. The car's average speed on the first part of the trip was 105 mph, and on the second part it was 115 mph. How long did the car drive at each speed?

Where is the speed limit that high?

To find the amount of time the car drove at each speed, we can use the formula:

Time = Distance / Speed

Since the problem does not tell us how long the car spent at each speed, let's introduce two unknowns.

Let X be the time (in hours) the car drove at 105 mph, and Y be the time (in hours) it drove at 115 mph. We want to find X and Y.

We know that the total distance traveled is 500 miles, and the total travel time is 5 hours.
So, X + Y = 5 ---(1)

We also know that the average speed is calculated as the total distance divided by the total time:
Average speed = (Distance traveled at 105 mph + Distance traveled at 115 mph) / Total time
Average speed = 500 miles / 5 hours
Average speed = 100 mph
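
For reference, a minimal Python sketch of that computation (the variable names are illustrative):

    # Overall average speed from the totals stated in the problem.
    total_distance_miles = 500
    total_time_hours = 5
    average_speed_mph = total_distance_miles / total_time_hours
    print(average_speed_mph)  # 100.0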

The average speed can also be written as the time-weighted average of the two speeds:
(105 mph * X + 115 mph * Y) / (X + Y) = 100 mph

Rearranging it:
105 mph * X + 115 mph * Y = 100 mph * (X + Y)
105 mph * X + 115 mph * Y = 100 mph * 5 hours (using equation 1)
105 mph * X + 115 mph * Y = 500 miles ---(2)

In other words, the two legs together must cover the full 500 miles.

Now, we have a system of equations:
105X + 115Y = 500 ---(2)
X + Y = 5 ---(1)

We can solve this system of equations to find the values of X and Y.
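
Here is a minimal Python sketch of that last step, solving (1) and (2) by substitution (the variable names are illustrative):

    # Equations:
    #   X + Y = 5              (total time, hours)      ---(1)
    #   105*X + 115*Y = 500    (total distance, miles)  ---(2)
    total_time = 5.0
    total_distance = 500.0
    v1, v2 = 105.0, 115.0

    # Substitute Y = total_time - X into (2) and solve for X:
    #   v1*X + v2*(total_time - X) = total_distance
    X = (total_distance - v2 * total_time) / (v1 - v2)
    Y = total_time - X
    print(f"Time at 105 mph: {X} h")  # 7.5
    print(f"Time at 115 mph: {Y} h")  # -2.5

Note that with the numbers as stated the solution is X = 7.5 h and Y = -2.5 h. A negative time means the given figures are mutually inconsistent: both 105 mph and 115 mph exceed the 100 mph overall average, so no non-negative split of the 5 hours can cover only 500 miles.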