An aeroplane leaves an airport and flies due north for two hours at 500 km/h. It then flies 450 km on a bearing of 053°. How far is the plane from the airport, and what is its bearing measured from east?

1st leg goes (0,2*500) = (0,1000)

2nd leg goes (450sin53°,450cos53°) = (359,271)

so, the plane ends up at (359,1271)
The distance is thus 1321 km, in the direction E74°N (about 74° north of east)
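A quick way to check these figures is to keep everything in (east, north) coordinates. Below is a minimal Python sketch of that check; the variable names are my own, chosen for illustration.

```python
import math

# Leg 1: due north for 2 h at 500 km/h -> (east, north) displacement in km
leg1 = (0.0, 2 * 500.0)

# Leg 2: 450 km on a bearing of 053 degrees (measured clockwise from north)
theta = math.radians(53)
leg2 = (450 * math.sin(theta), 450 * math.cos(theta))

# Final position relative to the airport
east = leg1[0] + leg2[0]    # ~359.4 km
north = leg1[1] + leg2[1]   # ~1270.8 km

distance = math.hypot(east, north)                        # ~1320.6 km
angle_from_east = math.degrees(math.atan2(north, east))   # ~74.2 degrees

print(f"position: ({east:.1f} km E, {north:.1f} km N)")
print(f"distance: {distance:.1f} km, {angle_from_east:.1f} degrees north of east")
```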

To find the distance from the airport, resolve each leg into east and north components and then use the Pythagorean theorem.

First, let's calculate the distance traveled in the north direction:

Distance = Speed x Time
Distance = 500 km/hr x 2 hr
Distance = 1000 km
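As a quick check of this step in Python (a trivial sketch, just distance = speed × time):

```python
speed_kmh = 500
time_h = 2
north_leg_km = speed_kmh * time_h   # 1000 km due north
print(north_leg_km)
```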

Now, let's resolve the 450 km leg flown on a bearing of 053°. A bearing is measured clockwise from north, so 53° is the angle between north and the aeroplane's path on this leg.

To find how far east and how far north this leg takes the plane, we can use trigonometry. With the 450 km leg as the hypotenuse, the east component is the side opposite the 53° angle and the north component is the side adjacent to it.

East component = 450 km × sin(53°) ≈ 359.4 km
North component = 450 km × cos(53°) ≈ 270.8 km
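These components can be verified with a short Python sketch (names are illustrative only):

```python
import math

bearing_deg = 53   # bearing of the second leg, clockwise from north
leg_km = 450

east_km = leg_km * math.sin(math.radians(bearing_deg))    # ~359.4 km east
north_km = leg_km * math.cos(math.radians(bearing_deg))   # ~270.8 km north
print(round(east_km, 1), round(north_km, 1))
```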

Adding the two legs, the plane ends up 359.4 km east and 1000 km + 270.8 km = 1270.8 km north of the airport. Therefore, the distance from the airport is:

Distance = √(Distance east² + Distance north²)
Distance = √((359.4 km)² + (1270.8 km)²)
Distance ≈ √(129,168 km² + 1,614,933 km²)
Distance ≈ √1,744,101 km²
Distance ≈ 1320.6 km

So, the plane is approximately 1321 km away from the airport.
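The same combination step, sketched in Python with the component values carried over from above:

```python
import math

east_km = 359.4            # total east displacement
north_km = 1000 + 270.8    # total north displacement

distance_km = math.hypot(east_km, north_km)   # sqrt(east^2 + north^2)
print(round(distance_km, 1))                  # ~1320.6 km
```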

To find the direction measured from east, we need the angle between the east direction and the line joining the airport to the plane's final position.

Since the east and north displacements are known, this angle comes from the tangent ratio:

Angle from east = arctan(north displacement ÷ east displacement)
Angle from east = arctan(1270.8 ÷ 359.4)
Angle from east ≈ 74.2°

Therefore, the plane lies approximately 74° north of east from the airport (E74°N), which is equivalent to a bearing of about 016° measured from north.
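The angle can be checked the same way; in this sketch, atan2 takes the north and east displacements and returns the angle measured from east:

```python
import math

east_km = 359.4
north_km = 1270.8

angle_from_east = math.degrees(math.atan2(north_km, east_km))   # ~74.2 degrees
bearing_from_north = 90 - angle_from_east                       # ~15.8 degrees, i.e. about 016
print(round(angle_from_east, 1), round(bearing_from_north, 1))
```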