An airplane is 48 miles due north of a town. An airport is 29 miles North-West of the town. It is determined that the airplane is 34 miles from the airport. On what bearing should the pilot fly to get to the airport?

Please show work if possible, I'm having a hard time drawing the diagram. Thanks

If we label things as

T: town
P: plane
A: airport

Then in triangle TPA, using the law of cosines,

29^2 = 34^2+48^2 - 2*34*48*cosP
P = 36.6°

Now you can figure out the heading (the same as the bearing of A from P): the airport lies west of the south-pointing line PT, so the bearing is 180° + 36.6° ≈ 216.6°.
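A quick numerical check (a minimal Python sketch; the coordinate setup, with the town at the origin, +y as north and +x as east, is my own and not part of the original problem):

```
import math

# Town at the origin; +y = north, +x = east.
P = (0.0, 48.0)                           # plane: 48 mi due north of the town
b = math.radians(315)                     # northwest = bearing 315 deg
A = (29 * math.sin(b), 29 * math.cos(b))  # airport: 29 mi NW of the town

# Distance from plane to airport (should be close to the stated 34 mi).
dx, dy = A[0] - P[0], A[1] - P[1]
print(f"plane-to-airport distance: {math.hypot(dx, dy):.1f} mi")  # ~34.3

# Bearing of the airport from the plane, measured clockwise from north.
bearing = math.degrees(math.atan2(dx, dy)) % 360
print(f"bearing of airport from plane: {bearing:.1f} deg")        # ~216.7
```

The small gap between 216.6° and 216.7° is just the problem rounding the true 34.3-mile distance to 34 miles.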

To solve this problem, we can use trigonometry and the concept of bearings. A bearing is a direction measured as an angle clockwise from due north, running from 0° to 360°. In this case, we are looking for the bearing the pilot should fly to reach the airport.

Let's start by drawing a diagram to visualize the given information (north is up, so the airplane sits directly above the town and the airport sits up and to the left):

```
        P (Airplane)
       /|
      / |
 34  /  |
    /   | 48
   A    |
    \   |
 29  \  |
      \ |
       \|
        T (Town)
```

From the information provided:
- The airplane is 48 miles directly north of the town.
- The airport is 29 miles northwest of the town.
- The airplane is 34 miles away from the airport.

Now, let's find the angle we need. The question asks for the bearing of the airport as seen from the airplane, so the relevant angle is the one at the airplane's corner of the triangle: the angle between the line from the plane back to the town (which points due south) and the line from the plane to the airport. Call it angle P.

Using the cosine rule, we can find angle P:

cos(P) = (b^2 + c^2 - a^2) / (2 * b * c)

where a is the side opposite angle P, and b and c are the two sides that meet at P. In this case:
- Side a is the distance between the town and the airport (29 miles).
- Side b is the distance between the airplane and the town (48 miles).
- Side c is the distance between the airplane and the airport (34 miles).

cos(P) = (48^2 + 34^2 - 29^2) / (2 * 48 * 34)
cos(P) = (2304 + 1156 - 841) / 3264
cos(P) = 2619 / 3264
cos(P) ≈ 0.8024

Let's find angle P by taking the arccosine (inverse cosine) of 0.8024:

P ≈ arccos(0.8024)
P ≈ 36.6 degrees

(As a check on the setup, the angle at the town comes out to arccos((48^2 + 29^2 - 34^2) / (2 * 48 * 29)) = arccos(1989 / 2784) ≈ 44.4 degrees, close to the exact 45° that "northwest" implies; the small difference is just rounding in the 34-mile distance.)
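Both angles can be checked directly from the three side lengths (a minimal Python sketch; the variable names are mine):

```
import math

PT, PA, TA = 48.0, 34.0, 29.0   # plane-town, plane-airport, town-airport (miles)

# Law of cosines, rearranged for the angle between two known sides.
P = math.degrees(math.acos((PT**2 + PA**2 - TA**2) / (2 * PT * PA)))
T = math.degrees(math.acos((PT**2 + TA**2 - PA**2) / (2 * PT * TA)))

print(f"angle at plane: {P:.1f} deg")  # ~36.6
print(f"angle at town:  {T:.1f} deg")  # ~44.4 (about 45, matching 'northwest')
```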

Now, to determine the compass bearing, remember that bearings are measured clockwise from north. From the airplane, the town lies due south (a bearing of 180 degrees), and the airport lies 36.6 degrees to the west of that southbound line, so we add the two:

bearing = 180 + 36.6 ≈ 216.6 degrees

So, the answer is that the pilot should fly on a bearing of approximately 216.6 degrees (that is, S 36.6° W) to reach the airport.
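If you want the last conversion step in code, here's a tiny sketch (the helper name is hypothetical and the 36.6° input comes from the work above):

```
# Interior angle at the plane (west of due south) -> three-figure bearing.
def to_bearing(angle_west_of_south: float) -> float:
    return (180.0 + angle_west_of_south) % 360.0

print(to_bearing(36.6))  # 216.6, i.e. S 36.6 deg W
```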