An aeroplane leaves an airport and flies due north for two hours at 500 km/h. It then flies 450 km on a bearing of 053°. How far is the plane from the airport, and what is its bearing from east?

Using the law of cosines, the distance z is

z² = 1000² + 450² − 2·1000·450·cos 127°

(the 127° is the angle at the turn: 180° − 53°), which gives z ≈ 1320.7 km.

If the plane started at (0, 0), then it ended at (east, north) ≈ (359.4, 1271).
Now you can figure the angle θ from east:
tan θ = 1271/359.4, so θ ≈ 74.2°
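If you want to verify the arithmetic, here is a short Python sketch of the same law-of-cosines computation (the variable names are just for illustration):

```python
import math

# Triangle sides: 1000 km (north leg) and 450 km (second leg),
# with a 127° angle between them at the turn (180° - 53°)
z = math.sqrt(1000**2 + 450**2 - 2 * 1000 * 450 * math.cos(math.radians(127)))

# Angle from east, using the (east, north) endpoint (359.4, 1271)
theta = math.degrees(math.atan2(1271, 359.4))

print(round(z, 1), round(theta, 1))  # ≈ 1320.7 km, ≈ 74.2°
```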

Too snappy, not self-explanatory. Please, I need the answer in a more explained form.

Well, well, well, looks like we have a plane that's ready for an adventure!

To find out how far the plane is from the airport, we can use the good ol' Pythagorean theorem. Let's break it down, shall we?

The plane flies due north for 2 hours at a speed of 500 km/hr, so it covers a distance of 2 hours * 500 km/hr = 1000 kilometers. That's a good start!

Next, it flies 450 kilometers on a bearing of 053. Now, here's where things get a bit trickier. We'll need to find the vertical and horizontal components of the distance traveled.

Using some trigonometry magic: a bearing of 053° is measured clockwise from north, so the direction makes an angle of 90° − 53° = 37° with east. That means the vertical (northward) distance traveled is 450 km * sin(37°), and the horizontal (eastward) distance is 450 km * cos(37°).

Oh, who am I kidding? Let's just summon the power of some math functions!

The vertical distance is approximately 270.82 km, and the horizontal distance is approximately 359.39 km.

Now, let's put it all together:

The total vertical (northward) distance from the airport is the sum of 1000 km (from flying north) and 270.82 km (from the bearing-053° leg), which is approximately 1270.82 km.

The total horizontal (eastward) distance is 359.39 km (all from the bearing-053° leg).

Here's where the good ol' Pythagorean theorem finally earns its keep: the straight-line distance from the airport is √((1270.82 km)² + (359.39 km)²) ≈ 1320.7 km.

For the final question, "what is its bearing from the east," well, we need to bring out another math tool called the inverse tangent function! Drumroll, please...

Using the inverse tangent function, the angle that the line back to the plane makes with east is arctan(1270.82 / 359.39) ≈ 74.2°.

So, to sum it all up, the plane is approximately 1320.7 km away from the airport, and its bearing measured from east is approximately 74.2° (equivalently, a bearing of about 016° from north).

I hope you enjoyed this little math-fueled adventure with our daredevil plane! Safe travels, my friend!
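If you'd like to double-check those numbers, here's a small Python sketch of the same component bookkeeping (variable names are mine, purely for illustration):

```python
import math

# First leg: due north for 2 h at 500 km/h
north = 2 * 500  # 1000 km

# Second leg: 450 km on a bearing of 053° (clockwise from north)
bearing = math.radians(53)
east = 450 * math.sin(bearing)    # horizontal (eastward) component
north += 450 * math.cos(bearing)  # vertical (northward) component

distance = math.hypot(east, north)                   # straight-line distance
angle_from_east = math.degrees(math.atan2(north, east))

print(f"distance ≈ {distance:.1f} km")               # ≈ 1320.7 km
print(f"angle from east ≈ {angle_from_east:.1f}°")   # ≈ 74.2°
```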

To determine the distance of the plane from the airport, we can break down the plane's movement into two components:

1. Northward movement: The plane flies due north for 2 hours at a speed of 500 km/hr. Therefore, the distance traveled in this direction is 2 hours * 500 km/hr = 1000 km.

2. Movement on the second leg: The plane flies 450 km on a bearing of 053°. A bearing is measured clockwise from north, so this leg has an eastward component of 450 km * sin(53°) and a northward component of 450 km * cos(53°).

Therefore, to find the total distance, we combine the total northward and eastward components using the Pythagorean theorem:

Distance = √((northward distance)² + (eastward distance)²)

Distance = √((1000 km + 450 km * cos(53°))² + (450 km * sin(53°))²)

To find the angle measured from east, we can use trigonometry:

tan(angle from east) = (northward distance) / (eastward distance)

Let's perform the calculations:

Eastward distance = 450 km * sin(53°) ≈ 359.39 km

Northward distance = 1000 km + 450 km * cos(53°) ≈ 1270.82 km

Distance = √((1270.82 km)² + (359.39 km)²) ≈ 1320.66 km

Angle from east = arctan((1270.82 km) / (359.39 km)) ≈ 74.21° (rounded to 2 decimal places)

Therefore, the plane is approximately 1320.66 km away from the airport, and its bearing measured from east is approximately 74.21°.
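The same calculation can be wrapped in a small, reusable Python function (the function name and signature are my own invention, just for illustration):

```python
import math

def position_after_legs(north_km: float, leg_km: float, bearing_deg: float):
    """Return (distance, angle_from_east_deg) after a due-north leg
    followed by one leg on the given bearing (clockwise from north)."""
    b = math.radians(bearing_deg)
    east = leg_km * math.sin(b)             # eastward component of second leg
    north = north_km + leg_km * math.cos(b) # total northward displacement
    return math.hypot(east, north), math.degrees(math.atan2(north, east))

dist, angle = position_after_legs(1000, 450, 53)
print(f"{dist:.2f} km, {angle:.2f}° from east")  # ≈ 1320.66 km, 74.21° from east
```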

To find the distance of the plane from the airport and its bearing from the east, we need to break down the given information.

1. The plane flies due north for two hours at a speed of 500 km/h. Since it flies due north, its direction is 0° (from the north). The formula to calculate distance is speed x time, so the distance covered in this leg of the journey is 500 km/h x 2 hours = 1000 km.

2. After flying north, the plane then flies 450 km on a bearing of 053°. A bearing is measured clockwise from north, so to find how far this leg takes the plane east and north, we resolve it into components:

East component = 450 km x sin(53°) ≈ 359.39 km
North component = 450 km x cos(53°) ≈ 270.82 km

The total northward displacement is therefore the 1000 km from the first leg plus 270.82 km from the second leg, i.e. about 1270.82 km. The total eastward displacement is 359.39 km.

3. To find the total distance from the airport, we combine the east and north components. By using the Pythagorean theorem, we can find the distance as follows:

Total distance = √(north component² + east component²)
= √((1270.82 km)² + (359.39 km)²)
= √(1,614,975 km² + 129,158 km²)
= √1,744,133 km²
≈ 1321 km (rounded to the nearest kilometer)

4. To find the angle measured from east, we can use trigonometry. With a northward displacement of 1270.82 km and an eastward displacement of 359.39 km, the inverse tangent function gives:

Angle from east = arctan(north component / east component)
= arctan(1270.82 km / 359.39 km)
= arctan(3.536)
≈ 74.21° (rounded to two decimal places)

Therefore, the plane is approximately 1321 km away from the airport, and its bearing measured from east is approximately 74.21°.
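As a final cross-check, the whole journey can be done with complex numbers, treating east as the real axis and north as the imaginary axis (this is just one convenient encoding, not the only way to do it):

```python
import cmath
import math

# East = real axis, north = imaginary axis
leg1 = 1000j                      # 1000 km due north
bearing = math.radians(53)        # 053°, measured clockwise from north
leg2 = 450 * complex(math.sin(bearing), math.cos(bearing))

total = leg1 + leg2
distance = abs(total)                               # straight-line distance
angle_from_east = math.degrees(cmath.phase(total))  # angle above the east axis

print(f"{distance:.1f} km at {angle_from_east:.1f}° from east")
```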