Two planes depart from the same airport at the same time. One hour later one of the planes is 480 miles due north of the airport and the other plane is 550 miles due east of the airport. At that time, what is the distance between the two planes?

Use the Pythagorean theorem to find the length of the diagonal that connects them.

a^2 + b^2 = c^2

Using the Pythagorean theorem:

x^2 = 480^2 + 550^2
x^2 = 230,400 + 302,500
x^2 = 532,900
x = √532,900
x = 730 miles
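
As a quick sanity check on the arithmetic, here is a minimal Python sketch of the same steps; the variable names are illustrative, not part of the original problem:

```python
import math

north = 480.0  # miles due north of the airport
east = 550.0   # miles due east of the airport

# Pythagorean theorem: distance^2 = north^2 + east^2
distance_squared = north**2 + east**2   # 230400 + 302500 = 532900
distance = math.sqrt(distance_squared)  # 532900 is exactly 730^2

print(distance)  # 730.0
```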

To find the distance between the two planes, we can use the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, we have a right triangle with one side measuring 480 miles (the distance north) and the other side measuring 550 miles (the distance east). We want to find the length of the hypotenuse, which represents the distance between the two planes.

Using the Pythagorean theorem, we can calculate the distance between the two planes:

Distance^2 = 480^2 + 550^2

Distance^2 = 230,400 + 302,500

Distance^2 = 532,900

Taking the square root of both sides, we find:

Distance = √532,900

Distance = 730 miles

Therefore, the distance between the two planes is exactly 730 miles. (Note that 532,900 is a perfect square: 730^2 = 532,900, so no rounding is needed.)
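
For completeness, Python's standard library also provides math.hypot, which computes the hypotenuse directly from the two legs; a one-line sketch of the same calculation:

```python
import math

# math.hypot(x, y) returns sqrt(x*x + y*y), i.e. the hypotenuse
print(math.hypot(480, 550))  # 730.0
```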