A plane is 48 miles west and 49 miles north of an airport. The pilot wants to fly directly to the airport. What bearing should the pilot take? In degrees and minutes

E arctan(49/48) S = E 45°35' S, i.e. a bearing of 90° + 45°35' = 135°35'

Actually, that is the pilot's heading.
It also happens to be the bearing of the airport from the plane.
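If anyone wants to verify this numerically, here is a small Python sketch (my own setup, not part of the original answers) that computes the same bearing with atan2:

    import math

    # The plane is 48 mi west and 49 mi north of the airport,
    # so from the plane the airport lies 48 mi east and 49 mi south.
    east = 48.0
    north = -49.0

    # atan2(east, north) gives the angle measured clockwise from due north.
    bearing = math.degrees(math.atan2(east, north)) % 360

    deg = int(bearing)
    minutes = (bearing - deg) * 60
    print(f"{bearing:.2f}°  ->  {deg}°{minutes:.0f}'")   # 135.59°  ->  135°35'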


45.6 degrees south of due east, which corresponds to a bearing of about 135.6° (135°35').

To determine the bearing the pilot should take to fly directly to the airport, we can use trigonometry on the right triangle whose legs are the 48-mile (east-west) and 49-mile (north-south) displacements between the plane and the airport.

First, we can find the hypotenuse of the triangle, which represents the direct distance between the plane and the airport (not strictly needed for the bearing, but useful to know). Using the Pythagorean theorem:

Hypotenuse^2 = (48 miles)^2 + (49 miles)^2
Hypotenuse^2 = 2304 miles^2 + 2401 miles^2
Hypotenuse^2 = 4705 miles^2

Taking the square root of both sides, we find:

Hypotenuse ≈ 68.59 miles
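As a quick check, Python's math.hypot gives the same direct distance (illustrative sketch only):

    import math

    # Direct distance from the plane to the airport (Pythagorean theorem).
    distance = math.hypot(48, 49)
    print(round(distance, 2))   # 68.59 (miles)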

Next, we can find the direction angle using trigonometric functions. The angle we are interested in is the one opposite the 48-mile (east-west) side; it is the angle between the flight path and the north-south line through the plane.

Using the tangent function:

tan(θ) = opposite/adjacent = 48 miles/49 miles
tan(θ) ≈ 0.9796

To find the angle θ, we can take the inverse tangent (arctan) of the result:

θ ≈ arctan(0.9796)

Using a calculator, we find:

θ ≈ 44.41 degrees
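A one-line check in Python (a sketch, assuming only the standard math module) confirms this value:

    import math

    theta = math.degrees(math.atan(48 / 49))
    print(round(theta, 2))   # 44.41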

Now we have the angle in decimal degrees. To convert it to degrees and minutes, we can express the decimal part as minutes.

Decimal Part = 44.41 - 44 = 0.41

Converting the decimal part to minutes:

Minutes = Decimal Part * 60
Minutes = 0.41 * 60
Minutes ≈ 24.6
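The same degrees-to-minutes conversion in Python (a small sketch; the variable names are just for illustration):

    theta = 44.41                          # decimal degrees from the arctan step
    deg, frac = divmod(theta, 1)
    minutes = frac * 60
    print(f"{int(deg)}° {minutes:.1f}'")   # 44° 24.6', i.e. about 44° 25'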

Therefore, the flight path makes an angle of approximately 44 degrees and 25 minutes with due south (toward the east). Measured clockwise from north, the bearing the pilot should take is 180° - 44°25' ≈ 135°35'.
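A short sketch of that last quadrant adjustment (assuming, as above, that bearings are measured clockwise from due north):

    theta = 44.41            # angle of the flight path east of due south
    bearing = 180 - theta    # convert to a bearing measured clockwise from north
    deg, frac = divmod(bearing, 1)
    print(f"bearing ≈ {bearing:.2f}° ≈ {int(deg)}°{frac * 60:.0f}'")   # ≈ 135.59° ≈ 135°35'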

From the given information, the plane is:

- 48 miles west of the airport
- 49 miles north of the airport

Let h be the angle of the flight path measured from due east toward the south.

tan h = opposite/adjacent
tan h = 49/48
h = tan⁻¹(49/48)
h ≈ 45°35'26"

The bearing, measured clockwise from due north, is therefore 90° + h ≈ 135°35'.
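For completeness, here is a Python sketch of this approach (the dms helper is my own, purely illustrative):

    import math

    h = math.degrees(math.atan(49 / 48))   # angle south of due east
    bearing = 90 + h                       # measured clockwise from due north

    def dms(angle):
        """Format a decimal-degree angle as degrees, minutes, seconds."""
        d = int(angle)
        m = int((angle - d) * 60)
        s = (angle - d - m / 60) * 3600
        return f"{d}° {m}' {s:.0f}\""

    print(dms(h))         # 45° 35' 26"
    print(dms(bearing))   # 135° 35' 26"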