An airplane leaves airport A and flies 200 km. At this time, its bearing from airport B, 250 km to the west, is 120 degrees. How far is the airplane from B?

Let the plane's position be P.

In triangle ABP, angle B = 30°, since a bearing of 120° from B is 30° past due east (090° + 30°) and A lies due east of B.

Let's find angle P with the sine law:
sin P/250 = sin 30°/200
sin P = 0.625

Case 1:
angle P = 38.682°
then angle A = 180° - 30° - 38.682° = 111.318°
BP/sin 111.318° = 200/sin 30°
BP = 372.63 km

Case 2:
angle P = 141.318°
then angle A = 180° - 30° - 141.318° = 8.682°
BP/sin 8.682° = 200/sin 30°
BP = 60.38 km

Both answers make sense: two given sides and a non-included angle (SSA) produce the "ambiguous case", and here both triangles are possible.
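As a quick numeric check (a minimal Python sketch, not part of the original working), the sine law reproduces both cases:

import math

# Known data: AB = 250 km, AP = 200 km, angle at B = 30 degrees (SSA).
AB, AP = 250.0, 200.0
B = math.radians(30.0)

# Sine law: sin(P)/AB = sin(B)/AP
sin_P = AB * math.sin(B) / AP                             # 0.625

for P in (math.asin(sin_P), math.pi - math.asin(sin_P)):  # the two SSA cases
    A = math.pi - B - P                                   # remaining angle at A
    BP = AP * math.sin(A) / math.sin(B)                   # sine law again, for BP
    print(f"angle P = {math.degrees(P):.3f} deg, BP = {BP:.2f} km")

# Prints angle P = 38.682 deg, BP = 372.63 km
# and    angle P = 141.318 deg, BP = 60.38 km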

1. A passenger in an airplane at an altitude of 10 km sees two towns to the east of the plane. The angle of depression to town A is 28° and the angle of depression to town B is 55°. How far apart are the two towns?
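For this one, the standard right-triangle setup (assuming flat ground and both towns due east of the point directly below the plane) gives each town's horizontal distance as altitude/tan(angle of depression), and the separation is the difference. A minimal sketch of that arithmetic:

import math

altitude = 10.0                                     # km
# The angle of depression from the plane equals the angle of elevation
# from each town, so horizontal distance = altitude / tan(angle).
dist_A = altitude / math.tan(math.radians(28))      # ≈ 18.81 km
dist_B = altitude / math.tan(math.radians(55))      # ≈ 7.00 km
print(f"the towns are {dist_A - dist_B:.2f} km apart")  # ≈ 11.81 km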

To find the distance between the airplane and airport B, we can also use the Law of Cosines.

Consider the triangle formed by the airplane (P), airport A, and airport B. We know the following information:
- Side AP = 200 km (the distance the airplane has flown)
- Side AB = 250 km (airport B is 250 km to the west of A)
- Angle B = 30 degrees (a bearing of 120 degrees from B lies 30 degrees south of the due-east line BA)

Since the known angle is at B and is not included between the two known sides, the Law of Cosines gives a quadratic in the unknown side BP:

AP^2 = AB^2 + BP^2 - 2 * AB * BP * cos(B)

Substituting the given values:

200^2 = 250^2 + BP^2 - 2(250) * BP * cos(30)

40000 = 62500 + BP^2 - 433.013 * BP

Rearranging:

BP^2 - 433.013 * BP + 22500 = 0

Solving with the quadratic formula:

BP = (433.013 ± sqrt(433.013^2 - 4 * 22500)) / 2
BP = (433.013 ± sqrt(97500)) / 2
BP = (433.013 ± 312.250) / 2

BP ≈ 372.63 km or BP ≈ 60.38 km

Therefore, the airplane is approximately 372.63 km or 60.38 km from airport B, matching the two ambiguous-case answers above.
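A minimal Python sketch to double-check the quadratic numerically (same setup as above):

import math

# Law of Cosines rearranged: BP^2 - (2*AB*cos B)*BP + (AB^2 - AP^2) = 0
AB, AP = 250.0, 200.0
b = -2 * AB * math.cos(math.radians(30))   # ≈ -433.013
c = AB**2 - AP**2                          # 22500
disc = math.sqrt(b * b - 4 * c)            # ≈ 312.25
print((-b + disc) / 2, (-b - disc) / 2)    # ≈ 372.63 and 60.38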
