An airplane is flying at an altitude of 10,000 ft, or 1.89 miles. The airport at which it is scheduled to land is 50 mi away. Find the average angle at which the airplane must descend for landing. Round your answer to the nearest degree.

tanθ = 1.89/50

2 degrees

To find the average angle at which the airplane must descend for landing, we can use basic trigonometry. The tangent of an angle is equal to the opposite side divided by the adjacent side. In this case, the angle we want is the descent angle below the horizontal, formed between the airplane's flight path and the ground.

First, we identify the opposite and adjacent sides of the right triangle formed by the altitude and the horizontal distance. The opposite side is the altitude of the airplane (10,000 ft ≈ 1.89 miles) and the adjacent side is the horizontal distance to the airport (50 miles). Note that both sides must be in the same unit (miles) before dividing.

Using the tangent function, we can calculate the angle:

tan(angle) = opposite/adjacent
tan(angle) = (1.89 miles)/(50 miles)
tan(angle) ≈ 0.0378

Now, to find the angle itself, we can take the inverse tangent (arctan) of 0.0378:

angle ≈ arctan(0.0378)
angle ≈ 2.16 degrees
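The calculation above can be checked with a few lines of Python, using the standard library's `math.atan` (which returns radians) and `math.degrees`:

```python
import math

altitude_ft = 10_000
altitude_mi = altitude_ft / 5280   # convert feet to miles: ≈ 1.89 mi
distance_mi = 50                   # horizontal distance to the airport

# tan(angle) = opposite / adjacent, so angle = arctan(altitude / distance)
angle_rad = math.atan(altitude_mi / distance_mi)
angle_deg = math.degrees(angle_rad)

print(f"descent angle ≈ {angle_deg:.2f}°")  # ≈ 2.17°
print(f"rounded: {round(angle_deg)}°")      # 2°
```

Starting from the exact 10,000 ft rather than the pre-rounded 1.89 mi gives 2.17° instead of 2.16°, but both round to the same whole degree.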

Therefore, the average angle at which the airplane must descend for landing is approximately 2 degrees.