An aeroplane leaves an airport and flies due north for 1.5 hours at 500 km/h on a bearing of 053°. Calculate its final distance and bearing from the airport.

To calculate the aeroplane's final distance and bearing from the airport, we resolve its displacement into north-south and east-west components using trigonometry.

First, let's break down the given information:
- The aeroplane flies for 1.5 hours at a speed of 500 km/h.
- It flies on a bearing of 053°, i.e. 53° measured clockwise from due north. (Note: the question's "due north" conflicts with the stated bearing; since a bearing is explicitly given, we take the direction of flight to be 053°.)

To determine the final position from the starting position, we need to find the horizontal (east-west) and vertical (north-south) components using trigonometry.

First, find the total distance flown by multiplying the time by the speed:
Total distance = 1.5 hours × 500 km/h = 750 km.

This 750 km is the length of the displacement along the bearing of 053° — it is not the north-south component on its own.

Because a bearing is measured clockwise from due north, the north-south component uses the cosine of the bearing and the east-west component uses the sine:

North-south component = 750 km × cos(53°) ≈ 750 km × 0.6018 ≈ 451.35 km.
East-west component = 750 km × sin(53°) ≈ 750 km × 0.7986 ≈ 598.95 km.
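As a quick numerical check, here is a minimal Python sketch of the resolution step (variable names are my own; the key point is that for a compass bearing, north pairs with cosine and east with sine):

```python
import math

speed_kmh = 500.0   # given speed
time_h = 1.5        # given flight time
bearing_deg = 53.0  # bearing, measured clockwise from due north

# Total distance flown along the bearing.
distance_km = speed_kmh * time_h  # 750 km

# Bearings are measured clockwise from north, so the north-south
# component uses cosine and the east-west component uses sine.
north_km = distance_km * math.cos(math.radians(bearing_deg))
east_km = distance_km * math.sin(math.radians(bearing_deg))

print(round(north_km, 2))  # ≈ 451.36
print(round(east_km, 2))   # ≈ 598.98
```

(The small differences from the hand-rounded values above come from rounding cos 53° and sin 53° to four decimal places.)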

Now we can confirm the final distance from the airport using the Pythagorean theorem:
Final distance = √((east-west component)² + (north-south component)²)
Final distance = √((598.95 km)² + (451.35 km)²)
Final distance ≈ 750 km.

This is exactly what we should expect: the aeroplane flies in a single straight line, so its distance from the airport is simply the 750 km it has travelled.
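The Pythagorean check can be sketched in a few lines (a self-contained example; `math.hypot` computes √(x² + y²)):

```python
import math

distance_km = 750.0  # 1.5 h at 500 km/h
bearing_deg = 53.0   # clockwise from due north

# Resolve into components: north pairs with cosine, east with sine.
north_km = distance_km * math.cos(math.radians(bearing_deg))
east_km = distance_km * math.sin(math.radians(bearing_deg))

# Recombining the components recovers the straight-line distance.
final_distance_km = math.hypot(east_km, north_km)
print(round(final_distance_km))  # 750
```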

To confirm the final bearing, we can use the inverse tangent function. Since a bearing is measured clockwise from north, the angle is arctan of the east-west component over the north-south component:

Final bearing = arctan(east-west component / north-south component)
Final bearing = arctan(598.95 km / 451.35 km)
Final bearing ≈ 53°.

Therefore, the aeroplane's final distance from the airport is 750 km, and its final bearing is 053°. Because the flight is a single straight leg, the distance and bearing from the airport are just the distance flown and the bearing given in the question.
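The bearing step can also be verified with `math.atan2`. One detail worth noting: for a compass bearing (clockwise from north), the east component goes first and the north component second — the reverse of the usual mathematical `atan2(y, x)` convention:

```python
import math

distance_km = 750.0
bearing_deg = 53.0

north_km = distance_km * math.cos(math.radians(bearing_deg))
east_km = distance_km * math.sin(math.radians(bearing_deg))

# atan2(east, north) gives the angle clockwise from north (the
# compass convention); % 360 keeps it in the 0°-360° bearing range.
recovered_bearing = math.degrees(math.atan2(east_km, north_km)) % 360
print(round(recovered_bearing))  # 53
```

Recovering the original bearing confirms the decomposition was consistent.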