An airplane leaves an airport and flies due west 150 miles, then 230 miles in the direction S 39.67° W. How far is the plane from the airport at this time, to the nearest mile?

150 mi @ 180° + 230 mi @ 230.33°

X = 150·cos 180° + 230·cos 230.33° = -296.82 mi
Y = 150·sin 180° + 230·sin 230.33° = -177.04 mi

d = sqrt((-296.82)^2 + (-177.04)^2) = 346 miles.
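The arithmetic above can be checked with a short script (a sketch in Python; the 230.33° heading comes from converting S 39.67° W to standard position: 180° + (90° − 39.67°) = 230.33°):

```python
import math

# Leg 1: 150 mi due west -> 180° in standard position.
# Leg 2: 230 mi at S 39.67° W -> 180° + (90° - 39.67°) = 230.33°.
x = 150 * math.cos(math.radians(180)) + 230 * math.cos(math.radians(230.33))
y = 150 * math.sin(math.radians(180)) + 230 * math.sin(math.radians(230.33))
d = math.hypot(x, y)  # straight-line distance back to the airport

print(round(x, 2), round(y, 2), round(d))  # -296.82 -177.04 346
```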

v = -(170 + 240 sin 69.5°) i - (240 cos 69.5°) j is the resulting vector

‖v‖ = √((170 + 240 sin 69.5°)² + (240 cos 69.5°)²) ≈ 403.64892 mi

Answer: 404 mi

Limy is wrong; Henry is correct. I'm looking at this problem right now on my test review, and the answer choices do not include 404. (Note that the 404 solution uses 170 mi, 240 mi, and 69.5°, which are not this problem's numbers of 150 mi, 230 mi, and 39.67°.)

To find the distance of the plane from the airport, we can use vector addition.

First, resolve the direction S 39.67° W into components. This bearing means 39.67° measured from due south toward the west, so the 230-mile leg has both a southward and a westward component.

The southward component (adjacent to the 39.67° angle):
S = 230 * cos(39.67°) ≈ 177.04 miles

The westward component (opposite the angle):
W = 230 * sin(39.67°) ≈ 146.82 miles

Next, combine these with the first leg of 150 miles due west. Taking east as positive x and north as positive y, both legs carry the plane west, and the second leg also carries it south:

X = -(150 + W) ≈ -296.82
Y = -S ≈ -177.04

Finally, the distance from the airport follows from the Pythagorean theorem:

Distance = sqrt(X^2 + Y^2) = sqrt(296.82^2 + 177.04^2) ≈ 345.6

To the nearest mile, the plane is 346 miles from the airport.
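The steps above can be sketched in Python (assuming, as the bearing notation implies, that S 39.67° W is measured 39.67° from due south toward the west):

```python
import math

angle = math.radians(39.67)
S = 230 * math.cos(angle)   # southward component of the 230-mi leg
W = 230 * math.sin(angle)   # westward component of the 230-mi leg

# East is +x, north is +y, so both legs push the plane into the third quadrant.
X = -(150 + W)
Y = -S

distance = math.sqrt(X**2 + Y**2)
print(round(distance))  # 346
```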