A plane leaves airport A and travels 540 miles to airport B on a bearing of N34°E. The plane later leaves airport B and travels to airport C, 390 miles away, on a bearing of S74°E. Find the distance from airport A to airport C to the nearest tenth of a mile.

Break the two trips into their north, east, and south components.

Combine the components to get the components of the trip from A to C.

Use the Pythagorean theorem to find the distance.

To find the distance from airport A to airport C, we can break each leg of the trip into its north-south and east-west components, combine them, and then apply the Pythagorean theorem.

Let's start with the first leg. The plane travels 540 miles from airport A to airport B on a bearing of N34°E, which means the path makes a 34° angle with due north, leaning toward the east. Its components are:

north distance = 540 * cos(34°)

east distance = 540 * sin(34°)

Using a calculator, the north distance is approximately 447.68 miles and the east distance is approximately 301.96 miles.
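As a quick sanity check, here is a minimal Python sketch of this step (the variable names are just for illustration; note that math.cos and math.sin expect radians):

import math

# Leg 1: A to B, 540 miles on a bearing of N34°E
# (the 34° angle is measured from due north toward the east)
leg1_miles = 540.0
bearing1 = math.radians(34)

north1 = leg1_miles * math.cos(bearing1)  # northward component
east1 = leg1_miles * math.sin(bearing1)   # eastward component

print(round(north1, 2), round(east1, 2))  # prints 447.68 301.96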

Next, the second leg. The plane travels 390 miles from airport B to airport C on a bearing of S74°E, which means the path makes a 74° angle with due south, again leaning toward the east. Its components are:

south distance = 390 * cos(74°)

east distance = 390 * sin(74°)

Using a calculator, the south distance is approximately 107.50 miles and the east distance is approximately 374.89 miles.
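The same sort of sketch handles the second leg; only the reference direction changes, since the cos(74°) part now points due south (again, the names are just for illustration):

import math

# Leg 2: B to C, 390 miles on a bearing of S74°E
# (the 74° angle is measured from due south toward the east)
leg2_miles = 390.0
bearing2 = math.radians(74)

south2 = leg2_miles * math.cos(bearing2)  # southward component
east2 = leg2_miles * math.sin(bearing2)   # eastward component

print(round(south2, 2), round(east2, 2))  # prints 107.5 374.89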

Now combine the two legs to get the components of the straight-line trip from A to C. The southward part of the second leg cancels some of the northward part of the first leg, while the two eastward parts add:

net north distance = 447.68 - 107.50 ≈ 340.18 miles

net east distance = 301.96 + 374.89 ≈ 676.85 miles

Finally, use the Pythagorean theorem on these net components:

distance^2 = (net north distance)^2 + (net east distance)^2

distance^2 ≈ 340.18^2 + 676.85^2

distance^2 ≈ 115722.4 + 458125.9

distance^2 ≈ 573848.3

distance ≈ √573848.3

distance ≈ 757.5 miles (rounded to the nearest tenth)

Therefore, the distance from airport A to airport C is approximately 757.5 miles.
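Putting everything together, here is a short Python sketch that checks the whole calculation end to end (the function name and default arguments are just for illustration):

import math

def distance_a_to_c(leg1=540.0, deg_e_of_n=34.0, leg2=390.0, deg_e_of_s=74.0):
    # Leg 1 (N 34° E): north and east components
    north = leg1 * math.cos(math.radians(deg_e_of_n))
    east = leg1 * math.sin(math.radians(deg_e_of_n))
    # Leg 2 (S 74° E): subtract the southward part, add the eastward part
    north -= leg2 * math.cos(math.radians(deg_e_of_s))
    east += leg2 * math.sin(math.radians(deg_e_of_s))
    # Pythagorean theorem on the net components
    return math.hypot(north, east)

print(round(distance_a_to_c(), 1))  # prints 757.5

Here math.hypot(x, y) is just √(x^2 + y^2), so the last line is the same Pythagorean step written in code, and it agrees with the hand calculation above.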