An aircraft flies from an airport X (lat 35°S, long 40°E) and after flying 1500 km due East, it reaches an airport Y. It then flies due South to another airport Z on latitude 70°S. Calculate, to 3 s.f.: a) the radius of the latitude through X; b) the longitude of Y; c) the speed of Z due to the rotation of the Earth; d) the distance between Y and Z; e) the distance of Z from the North Pole.


a) To calculate the radius of the latitude through airport X, we can use the location coordinates provided. The radius of the Earth is approximately 6,371 km.

The latitude measures the angle between the equatorial plane and a line connecting the point and the center of the Earth. Since airport X is located at latitude 35°S, this means it is 35 degrees south of the equator.

To determine the radius of the latitude through X, we need the distance from the Earth's axis of rotation to the airport, i.e. the radius of the circle of latitude through X. This is given by:

Radius of latitude = Radius of the Earth × cos(latitude)

Substituting the values, we have:

Radius of latitude = 6,371 km × cos(35°)

Calculating this:

Radius of latitude ≈ 6,371 × 0.8192 ≈ 5,218.8 km ≈ 5,220 km (3 s.f.)

Therefore, the radius of the latitude through airport X is approximately 5,220 km.
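A quick numeric check of the parallel's radius (assuming a mean Earth radius of 6,371 km; some textbooks use 6,370 km):

```python
import math

R = 6371.0    # assumed mean Earth radius in km
lat_X = 35.0  # latitude of X in degrees (south of the equator)

# Radius of the circle of latitude through X: R * cos(latitude)
r_X = R * math.cos(math.radians(lat_X))
print(round(r_X, 1))  # ≈ 5218.8 km, i.e. 5220 km to 3 s.f.
```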

b) Flying due East keeps the latitude constant at 35°S but changes the longitude; the 1500 km arc lies along the circle of latitude, whose radius is the 5,220 km found in part (a). The angle this arc subtends at the centre of that circle is:

θ = (1500 / 5,220) × (180/π) ≈ 16.5°

Since the aircraft flies East from 40°E, the longitude of Y is:

40° + 16.5° = 56.5°E (3 s.f.)

Therefore, the longitude of airport Y is approximately 56.5°E.
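Note that flying due East changes the longitude (it is the latitude that stays fixed). The change follows from the radius of the 35°S parallel; a short sketch, again assuming a 6,371 km Earth radius:

```python
import math

R = 6371.0
r_X = R * math.cos(math.radians(35.0))  # radius of the 35°S parallel, km

# Angle (in degrees) subtended at the parallel's centre by a 1500 km arc
dlon = math.degrees(1500.0 / r_X)
lon_Y = 40.0 + dlon  # flying East from 40°E increases the longitude
print(round(dlon, 1), round(lon_Y, 1))  # ≈ 16.5°, so Y is at about 56.5°E
```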

c) Because of the Earth's rotation, any point on the surface travels once around its circle of latitude every 24 hours, so no flight time is needed. Airport Z lies on latitude 70°S, so the radius of its circle of latitude is:

r = 6,371 km × cos(70°) ≈ 2,179 km

The circumference of that circle is 2π × 2,179 ≈ 13,690 km, covered in 24 hours, so:

Speed = 13,690 / 24 ≈ 570 km/h (3 s.f.)

Therefore, the speed of Z due to the rotation of the Earth is approximately 570 km/h.
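In fact, the rotational speed needs only the 24-hour rotation period and the radius of Z's parallel, not a flight time. A sketch, assuming a 6,371 km Earth radius and a 24 h day (the small sidereal-day correction is ignored):

```python
import math

R = 6371.0       # assumed mean Earth radius in km
period_h = 24.0  # one rotation per day

r_Z = R * math.cos(math.radians(70.0))  # radius of the 70°S parallel, km
speed = 2 * math.pi * r_Z / period_h    # circumference covered per hour
print(round(speed))  # ≈ 570 km/h
```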

d) The aircraft flies due South from Y to Z, so it moves along a meridian, which is a great circle of radius equal to the Earth's radius (6,371 km), not the radius of a parallel. The angular difference in latitude between Y and Z is:

Angular difference = 70° − 35° = 35°

The arc length along the meridian is:

Distance = (35/360) × 2π × 6,371 km ≈ 3,890 km (3 s.f.)

Therefore, the distance between airport Y and airport Z is approximately 3,890 km.
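As a check: the north-south distance uses the Earth's radius (the meridian is a great circle) with the latitude difference converted to radians, not multiplied in degrees. A minimal sketch, assuming a 6,371 km Earth radius:

```python
import math

R = 6371.0  # assumed mean Earth radius in km

# Y (35°S) and Z (70°S) share a meridian, a great circle of radius R
dlat = 70.0 - 35.0                # angular difference in latitude, degrees
dist_YZ = math.radians(dlat) * R  # arc length = radius * angle in radians
print(round(dist_YZ))  # ≈ 3892 km, i.e. 3890 km to 3 s.f.
```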

e) Airport Z is at latitude 70°S, i.e. 70 degrees south of the equator, while the North Pole is at 90°N. Because Z is in the opposite hemisphere, the angular distance from the North Pole to Z, measured along a meridian, is:

Angular distance = 90° + 70° = 160°

The corresponding arc length along the meridian (a great circle of radius 6,371 km) is:

Distance = (160/360) × 2π × 6,371 km ≈ 17,800 km (3 s.f.)

Therefore, the distance of airport Z from the North Pole is approximately 17,800 km.
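A numeric check: since Z is in the southern hemisphere, its angular distance from the North Pole is 90° + 70° = 160°, not 90° − 70°. Sketch under the same 6,371 km Earth-radius assumption:

```python
import math

R = 6371.0  # assumed mean Earth radius in km

# Z at 70°S is 90° + 70° = 160° of meridian arc from the North Pole
angle = 90.0 + 70.0
dist_NP = math.radians(angle) * R  # arc length along the great circle
print(round(dist_NP))  # ≈ 17791 km, i.e. 17800 km to 3 s.f.
```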