A plane leaves an airport A (38.5°N, 37.05°W) and flies due north to a point B on latitude 52°N.

(a) Find the distance covered by the plane
(b) The plane then flies due east to a point C, 2400 km from B. Determine
the position of C.
Take the value of π as 22/7 and the radius of the earth as 6370 km.

Cir = circumference of earth = 2 pi R = 2 * 22/7 * 6370 km = 40040 km

so
distance north = [(52 - 38.5) / 360] * Cir = (13.5 / 360) * 40040 ≈ 1501.5 km

B is at latitude 52°N.
The radius of that circle of latitude (its distance from the N/S axis) is r = 6370 cos 52°.
We fly 2400 km east along that latitude,
so the angle flown east = [2400 / (2 * 22/7 * r)] * 360 degrees of longitude.
New west longitude = 37.05° - that angle flown east.
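
If it helps to check the arithmetic, here is a minimal Python sketch of those same two steps (the variable names are my own, and pi is taken as 22/7 as the question requires):

    import math

    PI = 22 / 7   # value of pi required by the question
    R = 6370      # radius of the earth in km

    # (a) distance flown due north from 38.5N to 52N
    circumference = 2 * PI * R                       # 40040 km
    dist_north = (52 - 38.5) / 360 * circumference   # about 1501.5 km

    # (b) longitude swept by flying 2400 km east along latitude 52N
    # (math.cos/math.radians use the true value of pi internally, close enough here)
    r = R * math.cos(math.radians(52))   # radius of the 52N circle of latitude
    angle_east = 2400 / (2 * PI * r) * 360
    lon_C = 37.05 - angle_east           # new west longitude

    print(round(dist_north, 1), round(angle_east, 2), round(lon_C, 2))

With the given figures this gives roughly 1501.5 km for part (a), an eastward change of about 35° of longitude, and C at about (52°N, 2°W).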

Alternatively, to find the distance covered by the plane from airport A to point B, you can use the Haversine formula, which gives the great-circle distance between two points on the Earth's surface from their latitudes and longitudes. Here are the steps to calculate the distance:

(a) Finding the distance from A to B:

Step 1: Convert the latitude and longitude of points A and B from degrees to radians.
- Latitude of A: 38.5°N
- Longitude of A: 37.05°W
- Latitude of B: 52°N
- Longitude of B: 37.05°W (unchanged, because the plane flies due north)

To convert degrees to radians, you can use the formula:
Radians = Degrees * (π/180)

Converting the latitude and longitude to radians:
- Latitude of A in radians = 38.5 * (π/180)
- Longitude of A in radians = -37.05 * (π/180)
- Latitude of B in radians = 52 * (π/180)
- Longitude of B in radians = -37.05 * (π/180)
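
Numerically, taking π as 22/7, these work out to roughly 0.672 rad (latitude of A), 0.908 rad (latitude of B) and -0.647 rad (longitude of A and of B).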

Step 2: Calculate the central angle between points A and B using the Haversine formula.
The Haversine formula is:
hav(c) = hav(φ2 - φ1) + cos(φ1) * cos(φ2) * hav(λ2 - λ1)

Where:
- hav is the haversine function
- φ1 and φ2 are the latitudes of points A and B in radians
- λ1 and λ2 are the longitudes of points A and B in radians

Let's denote the central angle as c.

Step 3: Calculate the distance using the central angle c and the radius of the Earth.
Distance = c * radius of the Earth   (with c expressed in radians)

Here, the radius of the Earth is given as 6370 km.

Now, let's calculate the distance from A to B:

hav(c) = hav(52 * (π/180) - 38.5 * (π/180)) + cos(38.5 * (π/180)) * cos(52 * (π/180)) * hav(0)

Because the plane flies due north, A and B share the same longitude (37.05°W), so λ2 - λ1 = 0 and the second term vanishes.

To calculate hav(x), you can use the formula:
hav(x) = sin²(x/2)

The complete formula for hav(c) therefore reduces to:
hav(c) = sin²((52 * (π/180) - 38.5 * (π/180))/2)

In other words, the central angle c is simply the latitude difference of 13.5° expressed in radians.

Calculate the value of hav(c) and then calculate c using the inverse haversine function:
c = 2 * arcsin(√(hav(c)))

Finally, calculate the distance:
Distance = c * 6370 km
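
If you want to check this numerically, here is a minimal Python sketch of the haversine route (the variable names are my own); note how, with the longitude difference equal to zero, it reproduces the roughly 1500 km obtained from the simple latitude-difference method in the first working above:

    import math

    R = 6370  # radius of the earth in km

    def hav(x):
        # haversine: hav(x) = sin^2(x/2)
        return math.sin(x / 2) ** 2

    lat_A, lon_A = math.radians(38.5), math.radians(-37.05)
    lat_B, lon_B = math.radians(52.0), math.radians(-37.05)  # due north, so longitude unchanged

    h = hav(lat_B - lat_A) + math.cos(lat_A) * math.cos(lat_B) * hav(lon_B - lon_A)
    c = 2 * math.asin(math.sqrt(h))  # central angle in radians
    distance = c * R                 # about 1501 km

    print(round(distance, 1))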

(b) Determining the position of point C:

After reaching point B, the plane flies due east for 2400 km to reach point C. To determine the position of point C, you need to calculate the new longitude of the plane.

Step 1: Find the longitude difference caused by traveling 2400 km east.
Longitude difference (in radians) = Distance covered east / (Radius of the Earth * cos(latitude)); multiply by 180/π to convert it to degrees.
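
For example, with the given figures: 6370 * cos 52° ≈ 3922 km, so the longitude difference ≈ 2400 / 3922 ≈ 0.612 rad, which is about 35.05°.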

Step 2: Add the longitude difference to the longitude of point B to get the new longitude of point C.

Position of C:
Latitude of C remains the same as point B (52° north).

Longitude of C = Longitude of B + Longitude difference (measured eastwards). With west longitudes written as negative numbers, as in Step 1 above, longitude of B = -37.05°, so the west longitude shrinks as the plane flies east.

Using the above steps, you can determine the position of point C.
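
Putting the numbers in: longitude of C ≈ -37.05° + 35.05° = -2°, so C is at approximately (52°N, 2°W), which agrees with the shorter working above.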