An airplane leaves the airport and flies due west 170 miles and then 240 miles in the direction 200°30'. Assuming the earth is flat, how far is the plane from the airport at this time (to the nearest mile)?

To find the distance of the airplane from the airport, we can resolve each leg of the flight into north-south and east-west components and then apply the Pythagorean theorem to the totals.

First, let's consider the distances traveled in the north-south and east-west directions separately.

The first leg is 170 miles due west, so it contributes 170 miles of westward displacement and nothing in the north-south direction.

Next, we need to resolve the 240-mile leg into its components. Because its direction is neither due north-south nor due east-west, it contributes to both directions.

The direction is given as 200°30', which means it is 200 degrees and 30 minutes measured clockwise from north. To convert minutes to degrees, we divide by 60: 30 minutes is 30/60 = 0.5 degrees, so the bearing is 200.5°.
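If you want to automate that conversion, a minimal Python helper might look like the following (the function name is my own, not from any standard library):

```python
def dms_to_degrees(degrees, minutes=0.0):
    """Convert degrees and arc-minutes to decimal degrees."""
    return degrees + minutes / 60.0

print(dms_to_degrees(200, 30))  # 200.5
```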

Now, we can find both components of the 240-mile leg using trigonometry. For a bearing θ measured clockwise from north, a leg of length d has a north-south component of d × cos(θ) and an east-west component of d × sin(θ).

North-south component: 240 × cos(200°30')
East-west component: 240 × sin(200°30')

Using a calculator, cos(200°30') ≈ -0.9367 and sin(200°30') ≈ -0.3502. Multiplying each by 240:

240 × (-0.9367) ≈ -224.80 miles (north-south)
240 × (-0.3502) ≈ -84.05 miles (east-west)

Both components are negative because a bearing of 200°30' lies between 180° and 270°, so it points south and west.

Now, we can combine the legs and use the Pythagorean theorem to find the distance from the airport. The total westward displacement is 170 + 84.05 = 254.05 miles, and the total southward displacement is 224.80 miles:

Distance from the airport = √(254.05² + 224.80²)

Using a calculator, √(64541.4 + 50535.1) = √115076.5 ≈ 339.2

Therefore, the airplane is approximately 339 miles from the airport at this time (to the nearest mile).
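As a sanity check on the arithmetic, here is a minimal Python sketch of the component method (the variable names are my own):

```python
import math

west_leg = 170.0                 # first leg: due west, in miles
second_leg = 240.0               # second leg length, in miles
bearing = 200.0 + 30.0 / 60.0    # 200 deg 30 min, clockwise from north

theta = math.radians(bearing)
east = second_leg * math.sin(theta)    # east-west component (negative = west)
north = second_leg * math.cos(theta)   # north-south component (negative = south)

total_east = -west_leg + east          # first leg is entirely westward
total_north = north

print(round(math.hypot(total_east, total_north)))  # 339
```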

Alternatively, we can solve this problem with the Law of Cosines. The Law of Cosines states that in any triangle, the square of one side equals the sum of the squares of the other two sides, minus twice the product of those two sides, multiplied by the cosine of the angle between them.
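In symbols, for a triangle with sides a, b, and c, where C is the angle between sides a and b (and opposite side c):

c² = a² + b² - 2ab × cos(C)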

Let's label the airport as point A, the end point of the first leg of the flight as point B, and the final position of the plane as point C.

From the given information, we know that AB = 170 miles and BC = 240 miles. To find the angle at B, note that the plane flew due west from A to B, so the bearing from B back toward A is due east, i.e., 90°. The second leg leaves B on a bearing of 200°30', so the angle between BA and BC is:

200°30' - 90° = 110°30'

Now, we can use the Law of Cosines:

AC² = AB² + BC² - 2 × AB × BC × cos(110°30')

AC² = 170² + 240² - 2 × 170 × 240 × cos(110°30')

Calculating the right-hand side of the equation:

AC² = 28900 + 57600 - 81600 × cos(110°30')

AC² = 86500 - 81600 × cos(110°30')

Next, let's calculate cos(110°30'). Since 110°30' lies in the second quadrant, its cosine is negative, and we can use the identity cos(θ) = -cos(180° - θ):

cos(110°30') = -cos(180° - 110°30')

cos(110°30') = -cos(69°30')

Using trigonometric tables or a calculator, we find that cos(69°30') ≈ 0.3502, so cos(110°30') ≈ -0.3502.
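A quick check of that identity with Python's math module:

```python
import math
print(math.cos(math.radians(110.5)))   # approximately -0.3502
print(-math.cos(math.radians(69.5)))   # same value
```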

Now we can substitute this value back into the equation:

AC² = 86500 - 81600 × (-0.3502)

AC² = 86500 + 28576.3

AC² ≈ 115076.3

Finally, we can take the square root of both sides to find AC:

AC ≈ √115076.3

AC ≈ 339.2 miles

Therefore, the plane is approximately 339 miles from the airport at this time (to the nearest mile), in agreement with the component method above.
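For completeness, here is the same Law of Cosines computation as a short Python sketch (again with my own variable names):

```python
import math

ab = 170.0   # first leg, miles
bc = 240.0   # second leg, miles

# Bearing from B back to A is due east (90 deg); the second leg leaves B
# on a bearing of 200.5 deg, so the included angle at B is 110.5 deg.
angle_b = math.radians(200.5 - 90.0)

ac_squared = ab**2 + bc**2 - 2 * ab * bc * math.cos(angle_b)
print(round(math.sqrt(ac_squared)))  # 339
```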