An aeroplane leaves an international airport and flies due north for 1 hour at 500 km/h. It then flies for 30 minutes at 800 km/h on a bearing of North 53 degrees East. Calculate its distance and bearing from the airport.

show all working

You fly on a heading, not a bearing.

Draw the diagram, then use the law of cosines to find the distance z (the angle at the turning point is 180° - 53° = 127°):
z^2 = 500^2 + 400^2 - 2(500)(400) cos 127° ≈ 650,726, so z ≈ 807 km
Relative to the airport, the plane is now at (x, y), where
y = 500 + 400cos53° ≈ 740.7 (north)
x = 400sin53° ≈ 319.5 (east)
So the plane's bearing is now (90-θ)°, where tanθ = y/x; that gives θ ≈ 66.7°, so the bearing is about N23°E.
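
A few lines of Python confirm both routes to the answer (just a sketch; the variable names are made up):

import math

# Law of cosines: the angle at the turning point is 180 - 53 = 127 degrees
z = math.sqrt(500**2 + 400**2 - 2 * 500 * 400 * math.cos(math.radians(127)))

# Component form: y is the northward distance, x the eastward distance (km)
y = 500 + 400 * math.cos(math.radians(53))
x = 400 * math.sin(math.radians(53))
bearing = 90 - math.degrees(math.atan(y / x))   # measured clockwise from north

print(round(z, 1), round(math.hypot(x, y), 1), round(bearing, 1))
# prints: 806.7 806.7 23.3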

To find the distance and bearing from the airport, we can resolve each leg of the flight into two components: a northward component and an eastward component.

1. First Leg (Due North):
The airplane flies due north for 1 hour at a speed of 500 km/h. Since distance = speed * time, the distance covered on this leg is:
Distance = Speed * Time
Distance = 500 km/h * 1 hour
Distance = 500 km due north

2. Second Leg (Bearing N53°E):
The airplane flies at a speed of 800 km/h on a bearing of North 53 degrees East for 30 minutes. We need to convert the time from minutes to hours:
Time = 30 minutes / 60 minutes/hour
Time = 0.5 hour

Distance = Speed * Time
Distance = 800 km/h * 0.5 hour
Distance = 400 km

A bearing of North 53 degrees East is measured from north towards east, so this 400 km leg has both a northward and an eastward component:
Northward Component = 400 km * cos(53 degrees) ≈ 400 km * 0.602 ≈ 240.7 km
Eastward Component = 400 km * sin(53 degrees) ≈ 400 km * 0.799 ≈ 319.5 km
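
A quick numerical check of these two components (a minimal Python sketch; the variable names are only illustrative):

import math

bearing = math.radians(53)
north_component = 400 * math.cos(bearing)   # ≈ 240.7 km towards north
east_component = 400 * math.sin(bearing)    # ≈ 319.5 km towards east
print(round(north_component, 1), round(east_component, 1))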

3. Total Distance:
The total displacement is the vector sum of the two legs. Adding the components:
Total Northward Distance = 500 km + 240.7 km = 740.7 km
Total Eastward Distance = 319.5 km
These are the two sides of a right-angled triangle, so we can use the Pythagorean theorem:
Distance = sqrt(Total Northward Distance^2 + Total Eastward Distance^2)
Distance = sqrt((740.7 km)^2 + (319.5 km)^2)
Distance = sqrt(548,636 km^2 + 102,080 km^2)
Distance = sqrt(650,716 km^2)
Distance ≈ 807 km (rounded to the nearest kilometre)
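
The same sum can be checked with math.hypot in Python (again only a sketch):

import math

total_north = 500 + 240.7          # km: first leg plus the north component of the second
total_east = 319.5                 # km: east component of the second leg
print(round(math.hypot(total_north, total_east)))   # prints 807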

4. Bearing:
The bearing is the angle between north and the line from the airport to the plane, measured clockwise (towards the east). We can use trigonometry to find this angle:
Angle = arctan(Total Eastward Distance / Total Northward Distance)
Angle = arctan(319.5 km / 740.7 km)

Now, let's calculate the angle:
Angle = arctan(319.5 km / 740.7 km)
Angle = arctan(0.431)
Angle ≈ 23.3 degrees (rounded to one decimal place)
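
To verify that last step numerically, a minimal Python sketch (values carried over from above):

import math

total_north, total_east = 740.7, 319.5                 # km
angle = math.degrees(math.atan2(total_east, total_north))
print(round(angle, 1))                                 # prints 23.3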

Therefore, the distance from the airport is approximately 807 km and the bearing is approximately North 23.3 degrees East (about 023°).

To calculate the distance and bearing of the airplane from the airport, we can resolve each leg of the journey into two components: a northward movement and an eastward movement.

1. First Leg (Due North):
The airplane travels due north for 1 hour at a speed of 500 km/hr. Therefore, the distance covered on this leg is:
Distance North = Speed * Time
= 500 km/hr * 1 hr
= 500 km

2. Second Leg (Bearing N53°E):
The airplane flies on a bearing of North 53 degrees East for 30 minutes at a speed of 800 km/hr. We first need to convert the time from minutes to hours:
Time (in hours) = 30 minutes / 60 minutes
= 0.5 hours

Next, we can calculate the distance covered along this bearing:
Distance = Speed * Time
= 800 km/hr * 0.5 hr
= 400 km

Since the bearing is measured from north towards east, this 400 km leg must be resolved into components:
Distance North (second leg) = 400 km * cos(53°) ≈ 240.7 km
Distance East = 400 km * sin(53°) ≈ 319.5 km

Now, we can calculate the total distance from the airport using the Pythagorean theorem, since the total northward and eastward distances form the two sides of a right-angled triangle:
Total North = 500 km + 240.7 km = 740.7 km
Total East = 319.5 km
Total Distance = sqrt((Total North)² + (Total East)²)
= sqrt((740.7 km)² + (319.5 km)²)
≈ 806.67 km (rounded to two decimal places)

To find the bearing from the airport, we can use trigonometry. The angle between the north direction and the line connecting the airport to the airplane is:
Angle = arctan(Total East / Total North)
= arctan(319.5 km / 740.7 km)
≈ 23.33 degrees (rounded to two decimal places)

Since bearings are measured clockwise from north, this angle is already the bearing relative to north:
Bearing = North 23.33 degrees East
≈ 023° (as a three-figure bearing)

Hence, the airplane is approximately 806.67 km away from the airport, and its bearing from the airport is approximately North 23.33 degrees East (about 023°).
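
As an independent cross-check of the workings above, here is a small Python sketch that represents each leg as a complex number, with the east component as the real part and the north component as the imaginary part (the function and variable names are purely illustrative):

import math

def leg(distance_km, bearing_deg):
    # east + i*north displacement for a leg flown on the given bearing from north
    b = math.radians(bearing_deg)
    return complex(distance_km * math.sin(b), distance_km * math.cos(b))

position = leg(500, 0) + leg(400, 53)        # first leg due north, second on N53°E
distance = abs(position)                     # ≈ 806.7 km
bearing = math.degrees(math.atan2(position.real, position.imag))   # ≈ 23.3°
print(round(distance, 1), round(bearing, 1))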