An airplane passes over an airport and continues flying on a heading of N 70° W for 3 km. The airplane then turns left and flies another 2 km until the airport is exactly due east of its position. What is the distance between the airplane and the airport, to the nearest tenth of a kilometer?

To find the distance between the airplane and the airport, we can use the Law of Cosines, with an assist from the Law of Sines. Let's break the problem down step by step.

Step 1: Draw a diagram

Draw a diagram to visualize the problem. Label the airport as point A, the point where the airplane turns as point B, and the airplane's final position as point C. The airplane passes directly over A, so triangle ABC has AB = 3 km and BC = 2 km.

Step 2: Determine the known distances and angles

From the information given, you know that the airplane flew 3 km on a heading of N 70° W, then turned left and flew another 2 km until the airport was exactly due east of its position. Sketch the vectors to help with the visualization.

Step 3: Find the third side of the triangle

To calculate the distance between the airplane and the airport, you need to find the length of side AC, which lies opposite the angle at the turn point B. That angle is not given directly, so derive it first.

Because the airport is due east of the final position C, side AC runs due west from A. The first leg AB runs N 70° W, which is 20° north of due west, so the angle at A is 90° - 70° = 20°. By the Law of Sines, sin(C) = AB · sin(A) / BC = 3 · sin(20°) / 2 ≈ 0.513, giving C ≈ 30.9° (the obtuse alternative would have the airplane doubling back rather than making a simple left turn). The angle at B is then 180° - 20° - 30.9° ≈ 129.1°.

Now apply the Law of Cosines:

AC² = AB² + BC² - 2 · AB · BC · cos(B)

In our scenario, AB = 3 km, BC = 2 km, and B ≈ 129.1°.

AC² = 3² + 2² - 2(3)(2) · cos(129.1°)
AC² = 9 + 4 - 12 · cos(129.1°)

Using a calculator, we can evaluate the expression:

AC² ≈ 9 + 4 - 12 · (-0.631)
AC² ≈ 13 + 7.57
AC² ≈ 20.57

Step 4: Find the distance between the airplane and the airport

The distance between the airplane and the airport is the square root of AC²:

AC ≈ √20.57
AC ≈ 4.54 km

So, the distance between the airplane and the airport is approximately 4.5 km (to the nearest tenth of a kilometer).
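As a quick numerical check, here is a short Python sketch of the same chain of steps (the variable names are ours; only the standard `math` module is used):

```python
import math

# Known sides: first leg (airport -> turn point) and second leg (turn -> final position).
ab = 3.0  # km, flown on a heading of N 70 W
bc = 2.0  # km, flown after the left turn

# Angle at the airport between the N 70 W track and the due-west line to C.
angle_a = math.radians(90.0 - 70.0)

# Law of Sines for the angle at C (acute solution), then the angle sum for B.
angle_c = math.asin(ab * math.sin(angle_a) / bc)  # ~30.9 degrees
angle_b = math.pi - angle_a - angle_c             # ~129.1 degrees at the turn

# Law of Cosines for the side opposite the turn-point angle.
ac = math.sqrt(ab**2 + bc**2 - 2 * ab * bc * math.cos(angle_b))
print(f"AC = {ac:.3f} km")  # -> AC = 4.536 km, i.e. 4.5 km
```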

Here is another way to solve this problem using trigonometry and the Law of Cosines, this time treating the unknown distance as a side adjacent to the known angle.

First, let's create a diagram to illustrate the given information. The airplane initially flies on a heading of N 70° W for 3 km. Then, it turns left and flies another 2 km until the airport is due east of its position.

```
              B (turn point)
             / \
      2 km  /   \  3 km  (heading N 70° W)
           /     \
  C ------+------ A (airport)
     airport due east of C
```

Let's label the points: A represents the airport, B represents the point where the airplane turns, and C represents the airplane's final position.

Using the Law of Cosines, we have the following formula:

a^2 = b^2 + c^2 - 2bc * cos(A)

In this situation, we are looking for the length of side b = AC, the distance between the airplane and the airport. The known side a = BC = 2 km lies opposite the angle at A, and c = AB = 3 km.

Here's how we can calculate it:

1. First, calculate the angle A at the airport:
Since the first leg runs N 70° W and the final position C lies due west of the airport (the airport is due east of C), the angle between AB and AC is what is left between the heading and due west:
A = 90° - 70° = 20°

2. Now substitute into the Law of Cosines and solve for the unknown side b:
2^2 = b^2 + 3^2 - 2 * 3 * b * cos(20°)
4 = b^2 + 9 - 6b * 0.9397
b^2 - 5.638b + 5 = 0

This is a quadratic in b, so apply the quadratic formula:
b = (5.638 ± √(5.638^2 - 4 * 5)) / 2
b ≈ (5.638 ± 3.434) / 2
b ≈ 4.536 km or b ≈ 1.102 km

The smaller root would put the final position east of the turn point, which would mean the airplane doubled back rather than making a simple left turn, so we take b ≈ 4.536 km.

So, the distance between the airplane and the airport, rounded to the nearest tenth of a kilometer, is approximately 4.5 km.
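If you want to sanity-check the quadratic, a minimal Python sketch (our own variable names, standard library only) is:

```python
import math

# BC^2 = AB^2 + AC^2 - 2*AB*AC*cos(A), rearranged as a quadratic in b = AC:
# b^2 - (2*AB*cos(A))*b + (AB^2 - BC^2) = 0
coef_b = -2 * 3.0 * math.cos(math.radians(20.0))  # linear coefficient
coef_c = 3.0**2 - 2.0**2                          # constant term

disc = math.sqrt(coef_b**2 - 4 * coef_c)
roots = ((-coef_b + disc) / 2, (-coef_b - disc) / 2)
print(roots)  # ~(4.536, 1.102); the left-turn geometry selects ~4.5 km
```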

Well, it seems like this airplane is playing a game of hide and seek with the airport! Let's see if we can figure out where they are.

First, the airplane flies on a heading of N 70° W for 3 km. That's a pretty precise heading! Maybe the pilot wanted to make sure they could enjoy the beautiful sunset while flying. Anyway, after that, the airplane decides to make a left turn, because right turns can be so mainstream.

Now, the airport is exactly due east of the airplane's position. Ah, I see what the airplane is doing! It's trying to surprise the airport from a different angle! "Hey, I'm over here now, Mr. Airport! Can you guess where I am?"

So, with all this fancy flying happening, we need to determine the distance between the airplane and the airport. To do that, we need a little bit of geometry.

If we draw this on a graph, we'll have a triangle whose vertices are the airport, the point where the airplane turned, and the airplane's final position. Let's say the turn point is point A, the final position is point C, and the airport is point B.

Now, let's connect points A and C. This line represents the path the airplane took after turning. We also draw a line connecting points A and B, which represents the original heading and distance the airplane flew.

To find the distance between the airplane and the airport, we need the length of side BC. There is no right angle inside triangle ABC itself, so we can't apply the Pythagorean theorem to it directly, but we can build a right angle: drop a perpendicular from the turn point A down to the east-west line through B and C, and call the foot of that perpendicular F.

From the 3 km leg, A sits 3 sin(70°) ≈ 2.819 km west of the airport and 3 cos(70°) ≈ 1.026 km north of the east-west line, so BF ≈ 2.819 km and AF ≈ 1.026 km.

In right triangle AFC, the 2 km side AC is the hypotenuse, so the Pythagorean theorem gives the remaining westward run:

FC² = AC² - AF²

FC² = 2² - 1.026²

FC² ≈ 4 - 1.053

FC² ≈ 2.947

Taking the square root of both sides, we find FC ≈ 1.717 km.

Adding the two pieces along the east-west line, BC = BF + FC ≈ 2.819 + 1.717 ≈ 4.536 km. Rounding to the nearest tenth, the distance between the airplane and the airport is approximately 4.5 kilometers.

So, the airport better be prepared for a surprise, because the airplane is about 4.5 kilometers away, ready to reveal its new location with a flourish!
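For anyone who'd rather let a computer do the right-triangle bookkeeping, here is a small Python version of the same idea (the names are ours; nothing beyond the standard `math` module):

```python
import math

# Components of the 3 km leg: how far west and how far north of the
# east-west line through the airport the turn point ends up.
north = 3.0 * math.cos(math.radians(70.0))  # ~1.026 km (the side AF)
west  = 3.0 * math.sin(math.radians(70.0))  # ~2.819 km (the side BF)

# The 2 km leg is the hypotenuse of right triangle AFC; Pythagoras
# gives the remaining westward run FC.
run = math.sqrt(2.0**2 - north**2)          # ~1.717 km

print(f"distance = {west + run:.1f} km")    # -> distance = 4.5 km
```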

If the airport is at (0, 0), the first leg ends at (-2.819, 1.026), since 3 sin(70°) ≈ 2.819 and 3 cos(70°) ≈ 1.026.

If the left turn is read as exactly 90°, the second leg is on a course of S 20° W for a distance x where

1.026 - x cos(20°) = 0
x ≈ 1.092

So, under that reading, the second leg ends at (-3.192, 0) and the airplane is about 3.2 km due west of the airport; note that x ≈ 1.092 km contradicts the stated 2 km leg. Using the stated 2 km instead, the westward run of the second leg is √(2² - 1.026²) ≈ 1.717 km, the leg ends at (-4.536, 0), and the distance is about 4.5 km.
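A short Python sketch (our variable names) makes it easy to compare the two readings side by side: a turn of exactly 90° versus a second leg of exactly 2 km.

```python
import math

# End of the first leg: 3 km on a heading of N 70 W from the airport at (0, 0).
bx = -3.0 * math.sin(math.radians(70.0))  # ~-2.819
by = 3.0 * math.cos(math.radians(70.0))   # ~1.026

# Reading 1: a 90-degree left turn (course S 20 W); the leg length is
# whatever brings the airplane back to the airport's latitude.
x = by / math.cos(math.radians(20.0))            # ~1.092 km, not 2 km
c1 = bx - x * math.sin(math.radians(20.0))       # ~-3.192

# Reading 2: a 2 km leg; the turn angle is whatever brings the airplane
# back to the airport's latitude.
c2 = bx - math.sqrt(2.0**2 - by**2)              # ~-4.536

print(f"90-degree turn: {abs(c1):.1f} km")  # -> 3.2 km
print(f"2 km leg:       {abs(c2):.1f} km")  # -> 4.5 km
```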