Calculus
posted by Alexis.
An air traffic controller spots 2 planes at the same altitude converging on a point as they fly at right angles to each other. One plane is 150 miles from the point moving at 450 miles per hour. The other plane is 200 miles from the point moving at 600 miles per hour. At what rate is the distance between the planes decreasing? How much time does the air traffic controller have to get one of the planes on a different flight path?

Let x and y be each plane's distance from the point. At the moment in question, x = 150 and y = 200, so the distance between the planes is d = sqrt(150^2 + 200^2) = 250.

x^2 + y^2 = d^2
Differentiating with respect to time:
2xx' + 2yy' = 2dd'

Since both planes are flying toward the point, their distances are shrinking, so x' = -450 and y' = -600:
2(150)(-450) + 2(200)(-600) = 2(250)d'
d' = (150(-450) + 200(-600))/250 = -187500/250 = -750

The distance between the planes is decreasing at 750 mph.

For the time available: the first plane reaches the point in 150/450 = 1/3 hour, and the second in 200/600 = 1/3 hour. Both arrive at the same instant, so the controller has 1/3 hour = 20 minutes to reroute one of them.
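As a quick numerical check (not part of the original post), the same computation can be sketched in a few lines of Python, using the related-rates formula d' = (xx' + yy')/d:

```python
import math

# Distances from the convergence point (miles)
x, y = 150.0, 200.0
# Rates of change: negative because each plane is closing on the point (mph)
dx, dy = -450.0, -600.0

d = math.hypot(x, y)          # separation distance: 250 miles
dd = (x * dx + y * dy) / d    # d' from 2xx' + 2yy' = 2dd'

# Time until the first plane reaches the point, in minutes
t_min = 60 * x / -dx

print(d)      # 250.0
print(dd)     # -750.0  (separation shrinking at 750 mph)
print(t_min)  # 20.0 minutes
```

The negative sign on dd confirms the distance is decreasing; its magnitude, 750 mph, matches the hand calculation.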