A jet flies between two points on the ground that are 500 km apart with an airspeed of 200 m/s. The destination is directly north of the point of origin of the flight. If a constant wind blows at 10 m/s toward the west during the flight, what direction must the plane fly relative to the air to arrive at the destination?

sin^(-1)(10/200) ≈ 2.9 degrees east of north

To determine the direction in which the plane must fly relative to the air, we use vector addition of relative velocities: the plane's velocity relative to the ground equals its velocity relative to the air plus the wind velocity.

Let's break down the velocities involved (a short sketch of this setup in east/north components follows the list):

1. Airspeed of the jet: 200 m/s. This is the magnitude of the plane's velocity relative to the air; its direction is what we need to find.
2. Wind velocity: 10 m/s toward the west.
3. Required ground velocity: due north, toward the destination.
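
As a minimal sketch of this setup, assuming an east = +x, north = +y sign convention and illustrative variable names that are not part of the problem statement:

```python
# East = +x, North = +y (assumed sign convention for this sketch)
wind = (-10.0, 0.0)        # m/s, constant wind blowing toward the west
airspeed = 200.0           # m/s, magnitude of the plane's velocity relative to the air

# The ground velocity must point due north: (0, ground_speed).
# From v_ground = v_air + v_wind, the air velocity is v_ground - v_wind,
# so its east component must be +10 m/s to cancel the westward wind.
air_velocity_east = 0.0 - wind[0]
print(air_velocity_east)   # 10.0 m/s toward the east
```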

The wind blows toward the west, so the plane must aim slightly east of north: the eastward component of its velocity relative to the air has to cancel the 10 m/s westward wind. The 200 m/s airspeed is the magnitude of that air velocity, so it is the hypotenuse of a right triangle whose legs are the 10 m/s eastward component and the northward component. Since the legs are perpendicular, the northward component, which is the plane's speed over the ground, follows from the Pythagorean theorem:

Ground speed = √(airspeed² − wind speed²)
= √(200² − 10²)
= √(40000 − 100)
= √39900
≈ 199.75 m/s
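
The same arithmetic in a quick sketch, continuing the assumed names from the snippet above:

```python
import math

airspeed = 200.0           # m/s, hypotenuse: magnitude of the velocity relative to the air
east_component = 10.0      # m/s, needed to cancel the 10 m/s westward wind

# Northward component of the air velocity = the plane's speed over the ground
ground_speed = math.sqrt(airspeed**2 - east_component**2)
print(round(ground_speed, 2))   # 199.75 m/s
```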

This ground speed is how fast the plane actually progresses toward the destination. Now we can find the heading, that is, the direction relative to the air in which the plane must fly.

Consider a right triangle whose hypotenuse is the plane's velocity relative to the air (200 m/s), with one leg the 10 m/s eastward component that cancels the wind and the other leg the northward ground speed. Using trigonometry, the heading angle θ, measured east of north, is the angle between the hypotenuse and the northward leg, so the 10 m/s leg is the side opposite θ.

sin(θ) = opposite/hypotenuse
sin(θ) = wind speed/airspeed
θ = sin^(-1)(wind speed/airspeed)
θ = sin^(-1)(10/200)
θ ≈ 2.87 degrees

Therefore, the plane must point about 2.87 degrees east of due north relative to the air to arrive at the destination.
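
To pull the calculation together, here is a short, self-contained sketch (not part of the original solution; the names are illustrative) that computes the heading and checks that the resulting ground velocity points due north:

```python
import math

airspeed = 200.0                      # m/s, relative to the air
wind = (-10.0, 0.0)                   # m/s, toward the west (east = +x, north = +y)

# Heading angle: sin(theta) = wind speed / airspeed, measured east of north
theta = math.asin(abs(wind[0]) / airspeed)
print(round(math.degrees(theta), 2))  # ~2.87 degrees east of north

# Air velocity for that heading
v_air = (airspeed * math.sin(theta), airspeed * math.cos(theta))

# Ground velocity = air velocity + wind velocity
v_ground = (v_air[0] + wind[0], v_air[1] + wind[1])
print(round(v_ground[0], 6), round(v_ground[1], 2))  # ~0.0 east, ~199.75 north -> due north
```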