A plane takes off from an airport and flies east at a speed of 350 mph. Ten minutes later, a second plane takes off from the same airport and flies east at a higher altitude at a speed of 400 mph. How long does it take the second plane to overtake the first plane?

If t = 0 when the second plane takes off, then you want the t for which

350(t + 1/6) = 400t
t = 7/6 hours, or 1 hour 10 minutes

Or, consider that in 10 minutes the 1st plane flies 350/6 ≈ 58.33 miles. The 2nd plane, going 50 mph faster, must then take 58.33/50 = 7/6 hours to make up the difference.
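The head-start argument above can be checked numerically with a short sketch (the speeds and the 10-minute head start come straight from the problem statement):

```python
# Relative-speed check: gap built up during the head start,
# divided by the closing speed, gives the catch-up time.
head_start_hours = 10 / 60            # 10-minute head start
lead_miles = 350 * head_start_hours   # distance the 1st plane covers first
closing_speed = 400 - 350             # mph by which the 2nd plane gains
t = lead_miles / closing_speed        # hours for the 2nd plane to catch up
print(t)                              # 7/6 hours ≈ 1.1667
```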

To determine how long it takes for the second plane to overtake the first plane, we can set up an equation based on their speeds.

Let's assume that the time it takes for the second plane to overtake the first plane is 't' hours.

In that case, the first plane would have been flying for (t + 10/60) hours, as it took off 10 minutes (1/6 hour) earlier.

We know that the first plane's speed is 350 mph and the second plane's speed is 400 mph.

Therefore, we can set up the following equation based on the distances traveled by each plane:

Distance traveled by the first plane = Distance traveled by the second plane

(speed of the first plane) * (time traveled by the first plane) = (speed of the second plane) * (time traveled by the second plane)

350 * (t + 10/60) = 400 * t

Now, we can solve this equation to find the value of 't'.

First, simplify the fraction: 10/60 reduces to 1/6, so the equation becomes:

350 * (t + 1/6) = 400 * t

Expand the equation:

350t + 350/6 = 400t

Combine like terms:

350t - 400t = -350/6

-50t = -350/6

Divide both sides by -50 to isolate 't':

t = (-350/6) / (-50)

Simplifying the fraction:

t = 7/6

Therefore, it takes the second plane 7/6 of an hour, or 70 minutes, to overtake the first plane.

In simplified form, the second plane overtakes the first plane in 1 hour and 10 minutes.
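As a final sanity check, plugging t = 7/6 back into both distance expressions should give the same mileage for each plane (a quick sketch, using only numbers from the problem):

```python
t = 7 / 6                      # hours after the second plane departs
d1 = 350 * (t + 1/6)           # first plane: t plus its 10-minute head start
d2 = 400 * t                   # second plane
print(d1, d2)                  # both ≈ 466.67 miles, so they meet here
```

Both planes have flown about 466.67 miles when the overtake happens, confirming the solution.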