A jet pilot is making a 1000 mile trip in her military Tomcat. She flies 800 mph for one hour and decides she wants to finish the trip so that she will have flown at an average of 1000 mph. How fast should she go to finish the trip and meet her goal? (Hint: Think carefully about what is meant by "average speed".)

To find the speed the pilot needs to maintain to achieve an average speed of 1000 mph for the entire trip, we need to consider the total distance and total time of the journey.

The total distance of the trip is 1000 miles. The pilot has already flown 800 miles in one hour, which means there are 200 miles left to cover.

Let's denote the speed the pilot needs to maintain for the remaining distance as 'x' mph. We'll set up an equation to solve for 'x'.

The time taken to cover the remaining 200 miles at speed 'x' is given by: time = distance / speed

Using the formula for average speed, we can write: average speed = total distance / total time

The total distance of the trip is 1000 miles, and the total time is the sum of the time taken to cover the first 800 miles at 800 mph and the time taken to cover the remaining 200 miles at 'x' mph.
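The time bookkeeping can be checked with a short sketch (Python; the finishing speed x = 2000 mph below is just an illustrative value, not part of the problem):

```python
# Total trip time as the sum of the two legs' times (time = distance / speed).
x = 2000.0                        # illustrative finishing speed in mph
time_first_leg = 800.0 / 800.0    # hours for the first 800 miles
time_second_leg = 200.0 / x       # hours for the remaining 200 miles
total_time = time_first_leg + time_second_leg
print(total_time)                 # -> 1.1 (hours)
```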

So, the equation becomes: 1000 mph = 1000 miles / [(800 miles / 800 mph) + (200 miles / x mph)]

The first leg takes 800 / 800 = 1 hour, so the equation simplifies to: 1000 = 1000 / (1 + 200/x)

Dividing both sides by 1000 and taking reciprocals: 1 + 200/x = 1

Subtracting 1 from both sides: 200/x = 0

But 200/x is strictly positive for every finite speed x; it only approaches 0 as x grows without bound. No finite value of x satisfies the equation.

Therefore, the goal is impossible. Averaging 1000 mph over a 1000 mile trip means completing the entire trip in exactly one hour, and the pilot has already used that full hour on the first 800 miles. No matter how fast she flies the remaining 200 miles, her average speed stays below 1000 mph, approaching it only in the limit of infinite speed.
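A quick numerical check of the average-speed formula (a minimal sketch; `average_speed` is a hypothetical helper, not part of the problem): the trip average climbs as the finishing speed grows, but approaches 1000 mph only in the limit.

```python
def average_speed(x):
    """Average speed in mph for the 1000 mile trip:
    800 miles in 1 hour, then 200 miles at x mph."""
    total_time = 1.0 + 200.0 / x  # hours
    return 1000.0 / total_time

# The average approaches 1000 mph only as x grows without bound.
for x in (200, 1_000, 10_000, 1_000_000):
    print(f"x = {x:>9} mph -> average = {average_speed(x):.2f} mph")
```

Even at a million miles per hour for the final leg, the average is only about 999.8 mph.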