6. A pilot flies a plane 310 miles west and then 420 miles north, where he lands the plane. How far is the pilot from his starting point? Round your answer to the nearest mile.

a. 283 miles
b. 365 miles
c. 522 miles
d. 619 miles

We can use the Pythagorean theorem to solve this problem. If we draw a right triangle with the horizontal distance traveled (310 miles) as one leg and the vertical distance traveled (420 miles) as the other leg, then the distance the pilot is from his starting point is the hypotenuse of the triangle. We can use the formula:

c^2 = a^2 + b^2

where c is the distance from the starting point, and a and b are the horizontal and vertical distances traveled, respectively. Plugging in the values we get:

c^2 = 310^2 + 420^2
c^2 = 96100 + 176400
c^2 = 272500
c = sqrt(272500)
c ≈ 522.02

Rounding to the nearest mile, we get answer (c) 522 miles.
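The arithmetic above can be checked with a short Python snippet; `math.hypot` computes the hypotenuse directly from the two legs:

```python
import math

# Legs of the right triangle: westward and northward distances (miles)
west = 310
north = 420

# Hypotenuse = straight-line distance back to the starting point,
# equivalent to sqrt(310**2 + 420**2)
distance = math.hypot(west, north)

print(round(distance))  # prints 522, the distance to the nearest mile
```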