An airplane leaves an airport and flies due west 150 miles and then 170 miles in the direction 210°50'. Assuming the Earth is flat, how far is the plane from the airport at this time (to the nearest mile)?

150 at 180° = (-150,0)

170 at 210°50' = (-146,-87)

total: (-296,-87)

distance = √(296^2 + 87^2) = 309 mi

I am assuming that 0° is east. If you are using 0° = north, then things have to be changed a bit.
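
If you want to check the arithmetic, here is a quick Python sketch of the component method above, using the same 0° = east, counterclockwise convention:

```python
import math

direction = 210 + 50/60   # 210°50' as decimal degrees

# First leg: 150 mi due west (180°); second leg: 170 mi at 210°50'
x = 150 * math.cos(math.radians(180)) + 170 * math.cos(math.radians(direction))
y = 150 * math.sin(math.radians(180)) + 170 * math.sin(math.radians(direction))

print(round(x), round(y))       # -296 -87
print(round(math.hypot(x, y)))  # 309
```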

To find the distance between the airplane and the airport at this time, we can use the law of cosines. This law states that in any triangle, the square of one side is equal to the sum of the squares of the other two sides, minus twice the product of the two sides and the cosine of the included angle.

Let's label the sides of the triangle formed as follows:
Side a = 150 miles (westward)
Side b = 170 miles (at an angle of 210°50')
Side c = distance between the airplane and the airport (which we need to find)

Now, we can use the law of cosines to calculate the distance:
c^2 = a^2 + b^2 - 2ab * cos(C)

First, let's convert the angle from degrees and minutes to decimal degrees:
210°50' = 210 + 50/60 ≈ 210.8333°
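
If you are working this out on a computer, the conversion is simply degrees + minutes/60 (plus seconds/3600 if there were any). A small Python helper might look like this (the name dms_to_degrees is just for illustration):

```python
def dms_to_degrees(degrees, minutes=0, seconds=0):
    """Convert an angle in degrees, minutes, seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

print(dms_to_degrees(210, 50))  # 210.8333...
```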

The angle C in this formula must be the interior angle of the triangle at the point where the plane turns, not the direction angle itself. Measuring directions counterclockwise from east (as in the first solution above), the line from the turning point back toward the airport points due east (0°), so the included angle is:

C = 360° - 210.8333° ≈ 149.1667° (that is, 149°10')

Now, we can substitute the values into the equation:
c^2 = 150^2 + 170^2 - 2 * 150 * 170 * cos(149.1667°)

Calculating this expression gives us:
c^2 ≈ 22,500 + 28,900 + 43,794 ≈ 95,194

Taking the square root of both sides, we get:
c ≈ √95,194 ≈ 308.5

Rounding the result to the nearest mile:
c ≈ 309 miles

Therefore, the distance between the airplane and the airport at this time is approximately 309 miles, which agrees with the component calculation in the first solution.
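
As a sanity check, here is a short Python sketch of that law-of-cosines calculation, again assuming directions are measured counterclockwise from east:

```python
import math

a, b = 150, 170              # the two legs of the flight, in miles
direction = 210 + 50/60      # direction of the second leg, decimal degrees
C = 360 - direction          # included angle at the turning point, about 149.17°

c_squared = a**2 + b**2 - 2 * a * b * math.cos(math.radians(C))
print(round(math.sqrt(c_squared)))  # 309
```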

To find the distance of the plane from the airport, we can use the concept of vector addition.

First, let's convert the given angle to decimal degrees. The angle 210°50' can be written as:

210 + (50/60) ≈ 210.8333°

Now, we can break down the distance and direction into horizontal and vertical components.

The airplane first flies due west for 150 miles. Taking east as the positive x-direction, this leg contributes -150 miles horizontally and 0 miles vertically.

Next, let's calculate the horizontal and vertical components for the second leg of the flight. To do so, we use trigonometry.

The angle 210.8333° is the direction angle measured from the positive x-axis (east). Since the airplane is heading roughly southwest on this leg, both its x (horizontal) and y (vertical) components will be negative.

The horizontal component can be calculated as:

170 * cos(210.8333°)

Similarly, the vertical component can be calculated as:

170 * sin(210.8333°)

Now, we can add up the horizontal components and vertical components separately.

Horizontal Component: -150 miles (the westward leg) + 170 * cos(210.8333°) ≈ -150 - 146.0 = -296.0 miles

Vertical Component: 0 miles + 170 * sin(210.8333°) ≈ -87.1 miles

To find the distance of the plane from the airport, we can use the Pythagorean theorem:

Distance = square root of ((Horizontal Component)^2 + (Vertical Component)^2)

Distance = √((-296.0)^2 + (-87.1)^2) ≈ √95,200 ≈ 308.5 miles

Rounded to the nearest mile, the plane is about 309 miles from the airport, which agrees with the other two methods.
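
If it helps, here is how the whole vector-addition procedure might look in Python. The helper name displacement_from_legs is just illustrative, and it assumes directions measured counterclockwise from east as described above:

```python
import math

def displacement_from_legs(legs):
    """Sum (distance, direction) legs into one displacement vector.

    Directions are decimal degrees measured counterclockwise from east;
    returns (x, y) in the same units as the input distances.
    """
    x = sum(d * math.cos(math.radians(theta)) for d, theta in legs)
    y = sum(d * math.sin(math.radians(theta)) for d, theta in legs)
    return x, y

legs = [(150, 180), (170, 210 + 50/60)]   # due west, then 210°50'
x, y = displacement_from_legs(legs)
print(round(x, 1), round(y, 1))           # -296.0 -87.1
print(round(math.hypot(x, y)))            # about 309 miles from the airport
```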