A plane flying at 200 knots left an airport A (30°S, 31°E) and flew due north to an airport B (30°N, 31°E).

(a) Calculate the distance covered by the plane, in nautical miles
(b) After a 15-minute stopover at B, the plane flew due west to an airport C (30°N, 13°E) at the same speed. Calculate the distance covered on this leg, in nautical miles.

Since A and B lie on the same longitude (31°E), they are separated by 60° of latitude, i.e. a central angle of θ = π/3 radians.

The distance along the meridian is s = Rθ,

where R is the radius of the Earth. Taking R = 3437.74677 nautical miles gives s = 3437.74677 * π/3 ≈ 3,600 nautical miles.

If the plane flies from B to C along the 30°N latitude line, it flies along a circle of radius Rcos30°, through an angle of 18° = π/10 radians, so the distance is Rcos30° * π/10 ≈ 935.3 nautical miles.
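The two arc lengths above can be checked numerically; a minimal Python sketch, using the Earth-radius value in nautical miles that appears later in this solution:

```python
import math

R = 3437.74677  # mean Earth radius in nautical miles

# A -> B: along the 31E meridian, through 60 degrees of latitude
dist_ab = R * math.radians(60)  # s = R * theta, theta = pi/3

# B -> C: along the 30N parallel (radius R*cos30), through 18 degrees of longitude
dist_bc = R * math.cos(math.radians(30)) * math.radians(18)  # theta = pi/10

print(round(dist_ab, 1), round(dist_bc, 1))  # 3600.0 935.3
```

With this value of R, one nautical mile is one minute of arc along a great circle, which is why the A-to-B leg comes out to exactly 60 * 60 = 3,600 nautical miles.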

To calculate the distance covered by the plane, we can also use the general formula for distance on a sphere, known as the great-circle distance.

(a) Distance covered from airport A to airport B:

Step 1: Convert the coordinates from degrees to radians.
Airport A: Latitude = -30° = -30° * π/180 radians = -0.5236 radians
Longitude = 31° = 31° * π/180 radians = 0.5411 radians

Airport B: Latitude = 30° = 30° * π/180 radians = 0.5236 radians
Longitude = 31° = 31° * π/180 radians = 0.5411 radians

Step 2: Calculate the spherical distance using the formula:
Distance = R * arccos(sin(latitude_A) * sin(latitude_B) + cos(latitude_A) * cos(latitude_B) * cos(longitude_A - longitude_B))

where R is the radius of the Earth (approximately 3437.74677 nautical miles, the value for which one nautical mile equals one minute of arc along a great circle).

Distance_A_B = 3437.74677 * arccos(sin(-0.5236) * sin(0.5236) + cos(-0.5236) * cos(0.5236) * cos(0.5411 - 0.5411))
= 3437.74677 * arccos(-0.25 + 0.75 * cos(0))
= 3437.74677 * arccos(-0.25 + 0.75)
= 3437.74677 * arccos(0.5)
= 3437.74677 * 1.0472

The distance covered from airport A to airport B is approximately 3,600 nautical miles, in agreement with s = Rθ above.
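This evaluation can be confirmed in Python; a short sketch using the same R and the arccos formula from Step 2:

```python
import math

R = 3437.74677  # Earth radius in nautical miles
lat_a, lat_b = math.radians(-30), math.radians(30)
dlon = 0.0  # A and B share the 31E meridian

# sin(-30)*sin(30) = -0.25 and cos(-30)*cos(30)*cos(0) = 0.75, so the
# argument of arccos is 0.5 and the central angle is pi/3
angle = math.acos(math.sin(lat_a) * math.sin(lat_b)
                  + math.cos(lat_a) * math.cos(lat_b) * math.cos(dlon))

print(round(R * angle, 1))  # 3600.0
```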

(b) Distance covered from airport B to airport C:

Step 1: Convert the coordinates from degrees to radians.
Airport C: Latitude = 30° = 30° * π/180 radians = 0.5236 radians
Longitude = 13° = 13° * π/180 radians = 0.2269 radians

Step 2: Calculate the spherical distance using the formula:
Distance = R * arccos(sin(latitude_B) * sin(latitude_C) + cos(latitude_B) * cos(latitude_C) * cos(longitude_B - longitude_C))

Distance_B_C = 3437.74677 * arccos(sin(0.5236) * sin(0.5236) + cos(0.5236) * cos(0.5236) * cos(0.5411 - 0.2269))
= 3437.74677 * arccos(0.25 + 0.75 * cos(0.3142))
= 3437.74677 * arccos(0.9633)
= 3437.74677 * 0.2718

The great-circle distance from airport B to airport C is approximately 934.3 nautical miles. Flying due west, however, the plane follows the 30°N parallel rather than the great circle, so the distance it actually covers is Rcos30° * π/10 ≈ 935.3 nautical miles.
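Both figures for the B-to-C leg can be reproduced numerically; a sketch comparing the great-circle distance with the distance along the parallel:

```python
import math

R = 3437.74677  # Earth radius in nautical miles
lat = math.radians(30)        # both B and C sit on 30N
dlon = math.radians(31 - 13)  # 18 degrees of longitude

# Great-circle distance B -> C
angle = math.acos(math.sin(lat) ** 2 + math.cos(lat) ** 2 * math.cos(dlon))
gc = R * angle

# Distance along the 30N parallel, the track flown when heading due west
par = R * math.cos(lat) * dlon

print(round(gc, 1), round(par, 1))  # 934.3 935.3
```

The great circle is slightly shorter than the parallel, as expected: away from the equator, a line of latitude is not a great circle.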