A pilot wishes to reach an airport 350 km due east of his present position. The wind is blowing from N25°E with a speed of 80 km/h, and the plane's airspeed is 400 km/h. What ground speed will the plane have, and what heading should the pilot steer to reach the airport? How many minutes will it take the plane to reach the airport?

To determine the heading the pilot should steer and the resulting ground speed, we use vector addition: the plane's velocity over the ground is the sum of its air velocity and the wind's velocity, and the heading must be chosen so that this resultant points due east.

Step 1: Resolve the wind's velocity into components
The wind blows from N25°E, which means it blows toward S25°W, i.e. 25° west of due south. Its velocity therefore has a southward component and a westward component. Because the 25° is measured from the north–south axis, the north–south part takes the cosine and the east–west part takes the sine.

Southward component of wind's velocity = wind speed * cos(angle)
Westward component of wind's velocity = wind speed * sin(angle)

Using the given wind speed of 80 km/h and the angle 25°:
Southward component = 80 km/h * cos(25°)
Westward component = 80 km/h * sin(25°)
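
As a quick numerical check, here is a minimal Python sketch of this decomposition (the variable names are mine, not part of the problem), taking east and north as the positive directions:

```python
import math

wind_speed = 80.0            # km/h, blowing FROM N25°E
angle = math.radians(25)     # bearing measured from due north

# The wind blows toward S25°W, so in an east/north frame both
# components come out negative (westward and southward).
wind_east = -wind_speed * math.sin(angle)    # ≈ -33.81 km/h
wind_north = -wind_speed * math.cos(angle)   # ≈ -72.50 km/h
print(wind_east, wind_north)
```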

Step 2: Determine the heading
For the plane's track over the ground to point due east, the north–south component of the resultant velocity must be zero: the northward component of the plane's air velocity has to cancel the wind's southward component. If θ is the angle the pilot steers north of due east, then

airspeed * sin(θ) = southward component of wind's velocity

θ = inverse sine(southward component of wind's velocity / airspeed)
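
A short Python sketch of this step (again with my own variable names):

```python
import math

airspeed = 400.0                                   # km/h
south_drift = 80.0 * math.cos(math.radians(25))    # wind's southward component

# The northward airspeed component must cancel the southward drift:
# airspeed * sin(theta) = south_drift
theta = math.asin(south_drift / airspeed)
print(math.degrees(theta))   # ≈ 10.44° north of due east
```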

Step 3: Determine the ground speed
With the crosswind cancelled, the resultant velocity points due east, so the ground speed is simply the net east–west component: the eastward component of the plane's air velocity minus the wind's westward component.

Ground speed = airspeed * cos(θ) - westward component of wind's velocity
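
Continuing the sketch from Step 2 (θ is recomputed here so the snippet stands alone):

```python
import math

airspeed = 400.0
angle = math.radians(25)
theta = math.asin(80.0 * math.cos(angle) / airspeed)   # heading from Step 2

# The crosswind is cancelled, so the resultant is purely eastward:
ground_speed = airspeed * math.cos(theta) - 80.0 * math.sin(angle)
print(ground_speed)   # ≈ 359.6 km/h
```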

Step 4: Determine the time
To find how long the flight takes, divide the distance by the ground speed, then convert the result from hours to minutes.

Time = Distance / Ground speed
Time = 350 km / Ground speed

Now let's plug in the values and calculate:

Southward component = 80 km/h * cos(25°) ≈ 72.50 km/h
Westward component = 80 km/h * sin(25°) ≈ 33.81 km/h

θ = inverse sine(72.50 km/h / 400 km/h) ≈ 10.44° north of due east

Ground speed = 400 km/h * cos(10.44°) - 33.81 km/h ≈ 359.6 km/h

Time = 350 km / 359.6 km/h ≈ 0.973 hours (approximately 58 minutes)
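
Putting all four steps together, here is a self-contained Python sketch that reproduces these numbers:

```python
import math

distance = 350.0      # km
airspeed = 400.0      # km/h
wind_speed = 80.0     # km/h, blowing from N25°E
angle = math.radians(25)

theta = math.asin(wind_speed * math.cos(angle) / airspeed)
ground_speed = airspeed * math.cos(theta) - wind_speed * math.sin(angle)
minutes = distance / ground_speed * 60

print(f"heading: {math.degrees(theta):.2f} deg north of due east")  # ≈ 10.44
print(f"ground speed: {ground_speed:.1f} km/h")                     # ≈ 359.6
print(f"time: {minutes:.1f} minutes")                               # ≈ 58.4
```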

So, to reach the airport, the pilot should steer approximately 10.44° north of due east (a bearing of about N79.6°E), which gives a ground speed of approximately 359.6 km/h. At that speed the plane will take about 58 minutes to reach the airport.
