A plane leaves an airport x,20-60E and 36-80N and flies due south along the same longitude for 8hours at the rate of 1000km/h to another airport Y,20-60E and titaS.the plane then flies west to another airport Z for 8hours at the same speed. Calculate to the nearest degree (a) the value of tita. (b) the longitude of Z

wow - can you make heads or tails of that gibberish? I assume that 20-60 means 20°60'. No idea what "tita" means. Must be "theta".

At any rate, since the circumference of the earth is about 40,000 km, the plane flew 8000 km = 1/5 of the way around the earth. That is 360°/5 = 72°. Starting from 36°80'N, that puts him at Y = 35°20'S.

The length of a line of latitude at an angle x is 40000·cos(x) km. That means that the "circumference" of the earth at 35°20'S is 40000 × 0.8158 = 32,632 km.

At that latitude, 8000 km is 0.245 × 360° = 88°15' westward. Starting from 20°60'E, he ends up at longitude 67°15'W.

Hmmm. I just noticed that 20-60E cannot be 20°60', since 60' = 1° so no one would write that. So, fix my misinterpretation and follow the steps to the correct result.

To calculate the value of θ and the longitude of Z, read the coordinates as decimal degrees, as the follow-up suggests: X is at 20.60°E, 36.80°N.

(a) First, find θ. The plane starts at latitude 36.80°N and travels due south for 8 hours at a speed of 1000 km/h. Taking the Earth's circumference as 40,000 km, 1 degree of latitude spans roughly 111 km, so the change in latitude is:

Change in latitude = Speed * Time / Distance per degree of latitude
= 1000 km/h * 8 h / (111 km/degree)
≈ 72 degrees
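As a quick numerical check, this step can be reproduced in a few lines of Python (the km-per-degree figure comes from dividing a 40,000 km circumference by 360°):

```python
# Southward leg: degrees of latitude covered in 8 h at 1000 km/h,
# assuming a 40,000 km Earth circumference.
speed_km_h = 1000
hours = 8
km_per_deg_lat = 40000 / 360  # ~111.1 km per degree of latitude

distance_km = speed_km_h * hours              # 8000 km
delta_lat_deg = distance_km / km_per_deg_lat  # degrees of latitude
print(delta_lat_deg)  # -> 72.0
```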

Since the plane is flying due south, the latitude of airport Y is 36.80° − 72° = −35.20°, i.e., 35.20°S: the plane crosses the equator into the southern hemisphere. Hence θ ≈ 35° (south) to the nearest degree, and Y is at 20.60°E, 35.20°S.

(b) To determine the longitude of the airport Z, we need to calculate the distance traveled during the 8-hour westward journey. We know the distance traveled can be calculated as:

Distance = Speed * Time
= 1000 km/h * 8 h
= 8000 km

The plane flies these 8000 km due west along the circle of latitude 35.20°S, so the longitude does change. The circumference of that circle is 40,000 × cos(35.20°) ≈ 40,000 × 0.8171 ≈ 32,685 km, so 8000 km corresponds to (8000 / 32,685) × 360° ≈ 88.1° of longitude, travelled westward. Starting from 20.60°E, the plane ends at 20.60° − 88.1° ≈ −67.5°, i.e., about 67.5°W.

In summary, to the nearest degree:
(a) θ ≈ 35° (airport Y is at about 35°S)
(b) The longitude of airport Z is about 68°W (67.5°W before rounding).
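The whole calculation can be sketched end to end in Python. This assumes the decimal-degree reading of the coordinates (X at 36.80°N, 20.60°E) and a 40,000 km circumference; north and east are positive, so a negative result means south or west.

```python
import math

CIRCUMFERENCE_KM = 40000.0
KM_PER_DEG_LAT = CIRCUMFERENCE_KM / 360  # ~111.1 km per degree

lat_x, lon_x = 36.80, 20.60  # airport X: degrees N, degrees E
leg_km = 1000 * 8            # 8000 km per leg

# (a) Southward leg: latitude decreases by the leg's worth of degrees.
lat_y = lat_x - leg_km / KM_PER_DEG_LAT  # 36.8 - 72 = -35.2, i.e. 35.2 S
theta = abs(lat_y)
print(f"theta = {theta:.1f} S -> {round(theta)} S")  # -> theta = 35.2 S -> 35 S

# (b) Westward leg: follow the circle of latitude through Y, which is
# shorter than the equator by a factor of cos(latitude).
km_per_deg_lon = KM_PER_DEG_LAT * math.cos(math.radians(lat_y))
delta_lon = leg_km / km_per_deg_lon  # ~88.1 degrees of longitude
lon_z = lon_x - delta_lon            # ~ -67.5, i.e. 67.5 W
print(f"longitude of Z = {abs(lon_z):.1f} W")  # -> longitude of Z = 67.5 W
```

Using a mean 6371 km Earth radius instead of the round 40,000 km circumference shifts the answers by well under a degree, so the nearest-degree results are unaffected.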