I'm not sure what I'm doing wrong here. An airplane is 166 miles north and 83 miles east of an airport. The pilot wants to fly directly to the airport. What bearing should the pilot take? (Round your answer to one decimal place.)

I got S 210 degrees W, and the problem requires the S and W. Please help, thanks!

That can't be true. The plane is NE of the airport, so it must fly SW to get there.

Since tan θ = 166/83, θ ≈ 63.4° (measured from due east), so the bearing of the plane from the airport is (90 - 63.4) = 26.6°
So the bearing of the airport from the plane is
26.6 + 180 = 206.6°
That is the direction the plane must fly.
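
If you want to double-check that 206.6° figure numerically, here is a small Python sketch (my own illustration, not part of the original problem) that computes the bearing of the airport as seen from the plane using atan2:

import math

# Displacement from the plane to the airport:
# the airport is 166 miles south and 83 miles west of the plane.
north = -166.0   # component toward north (negative means south)
east = -83.0     # component toward east (negative means west)

# atan2(east, north) gives the angle measured clockwise from due north
bearing = math.degrees(math.atan2(east, north)) % 360
print(round(bearing, 1))   # prints 206.6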

By the way, User, S210°W is just nonsense; in S...W notation the angle has to be between 0° and 90°.

You are correct, and I apologize for the confusion. The plane is indeed northeast of the airport, so the pilot must head southwest.

Using the corrected angle θ = arctan(166/83) ≈ 63.4°, the bearing of the plane from the airport is 90° - 63.4° = 26.6°. To determine the bearing of the airport from the plane, we add 180° to account for the opposite direction:

Bearing = 26.6° + 180°
= 206.6°

Therefore, the correct bearing for the pilot to take is approximately 206.6°, or S 26.6° W in the notation the problem asks for.

Thank you for catching that error, and I apologize again for any confusion caused.

To determine the bearing that the pilot should take to fly directly to the airport, we can use trigonometry to find the direction of the line from the airplane to the airport.

Let's consider the airplane's position as point A (166 miles north and 83 miles east of the airport), and the airport's position as point B.

First, let's calculate the distance between point A and B using the Pythagorean theorem:

AB = √((166)^2 + (83)^2)
= √(27556 + 6889)
= √34445
≈ 185.6 miles
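
As a quick check (a throwaway snippet of my own, not part of the problem statement), Python's math.hypot gives the same distance:

import math

distance = math.hypot(166, 83)   # sqrt(166**2 + 83**2)
print(round(distance, 1))        # prints 185.6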

Now, consider the right triangle whose legs are the 166-mile north displacement and the 83-mile east displacement (the 185.6-mile distance AB is the hypotenuse, so it is not one of the legs). The angle θ between the north leg and the line AB is found using the inverse tangent:

θ = arctan(east leg / north leg)
= arctan(83 / 166)

Using a calculator, we find:
θ ≈ 26.6 degrees (measured from due north toward the east)
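
A one-line check of that arctangent (again, just an illustrative snippet on my part, not part of the original solution):

import math

theta = math.degrees(math.atan(83 / 166))
print(round(theta, 1))   # prints 26.6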

Since the airplane is north and east of the airport, the airport lies to the south and west of the airplane, so the pilot must head 26.6 degrees to the west of due south. Therefore, the bearing that the pilot should take to fly directly to the airport is:

Bearing = S 26.6° W
(equivalently, 180° + 26.6° = 206.6° measured clockwise from north)

Rounded to one decimal place:
Bearing ≈ S 26.6° W

Please note that a true bearing is usually written as three digits measured clockwise from north (about 207° here), while the quadrant form used above, S 26.6° W, gives an acute angle measured from due south toward the west.
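
For completeness, here is a short sketch (my own helper with a hypothetical name, to_quadrant, not anything from the problem) that converts a true bearing measured clockwise from north into the quadrant form the problem asks for:

def to_quadrant(bearing_deg):
    # Convert a true bearing (degrees clockwise from north)
    # into quadrant notation such as "S 26.6 W".
    b = bearing_deg % 360
    if b <= 90:
        return f"N {round(b, 1)} E"
    elif b <= 180:
        return f"S {round(180 - b, 1)} E"
    elif b <= 270:
        return f"S {round(b - 180, 1)} W"
    else:
        return f"N {round(360 - b, 1)} W"

print(to_quadrant(206.6))   # prints S 26.6 W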