Question
A pilot flies a plane 310 miles west and then 420 miles north, where he lands the plane. How far is the pilot from his starting point? Round your answer to the nearest mile.

Note that the pilot has flown a total path length of 310 + 420 = 730 miles, but the question asks for the straight-line distance back to the starting point, which is shorter.

To find that distance, we can use the Pythagorean theorem: the westward and northward legs are perpendicular, so they form the two legs of a right triangle, and the straight-line distance is the hypotenuse.

We use the formula a^2 + b^2 = c^2, where a and b are the distances flown west and north, respectively, and c is the straight-line distance from the starting point.

Therefore, c^2 = 310^2 + 420^2
c^2 = 96100 + 176400
c^2 = 272500
c = √272500
c ≈ 522.02 miles

Rounded to the nearest mile, the pilot is approximately 522 miles from his starting point.
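
As a quick sanity check, here is a short Python sketch of the same calculation (the variable names west and north are just illustrative):

```python
import math

# Legs of the right triangle: distances flown west and north, in miles
west = 310
north = 420

# Pythagorean theorem: the hypotenuse is the straight-line distance,
# c = sqrt(west^2 + north^2); math.hypot computes exactly this
distance = math.hypot(west, north)

print(round(distance))  # prints 522
```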