A plane is flying in a straight line at an unchanging height and unchanging speed. The pilot must drop an object onto a target in front of the plane. At what angle must the pilot see the target at the moment of dropping the object? At that moment, what is the distance between the target and the plane's projection onto the ground? Disregard air resistance.

To determine the angle at which the pilot must see the target and the distance between the target and the plane's projection onto the ground, we can break down the problem into two parts:

1. Finding the angle:
Let the plane fly at height 'h' with speed 'v', and let 'd' be the horizontal distance between the target and the plane's projection onto the ground at the moment of release. The pilot sees the target at an angle θ below the horizontal, and this angle satisfies:

tan(θ) = h / d

Rearranging the equation, we get:

θ = arctan(h / d)

Where:
θ = the angle below the horizontal at which the pilot must see the target,
h = height of the plane,
v = speed of the plane, and
d = horizontal distance between the target and the plane's projection onto the ground at the moment of release.

The distance d is not a free choice: it is fixed by how far the object travels horizontally while it falls, which is worked out in part 2 below.
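As a purely illustrative check of the geometry (values assumed, not taken from the problem): with h = 1000 m and d = 2000 m, the target would appear at arctan(1000 / 2000) = arctan(0.5) ≈ 27° below the horizontal.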

2. Finding the distance between the target and the plane's projection:
With air resistance disregarded, the dropped object keeps the plane's horizontal speed 'v' throughout the fall, so during the fall time 't' it moves forward a horizontal distance:

distance = speed × time, that is, d = v × t

(Because the plane continues at that same speed, it stays directly above the falling object for the whole fall.)

The fall time follows from free fall through the height h:

h = (1/2) g t², so t = √(2h / g)

Therefore the object must be released when the target is a horizontal distance

d = v × t = v √(2h / g)

ahead of the plane's projection onto the ground. This is the distance asked for. Substituting it into the result of part 1 gives the required angle:

tan(θ) = h / d = h / (v √(2h / g)) = (1/v) √(g h / 2)
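As a numerical check with assumed values (none are given in the problem), take h = 500 m, v = 100 m/s, and g ≈ 9.8 m/s²: then t = √(2 × 500 / 9.8) ≈ 10.1 s, d = 100 × 10.1 ≈ 1010 m, and tan(θ) = 500 / 1010 ≈ 0.50, so θ ≈ 26° below the horizontal.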

In summary, to solve the problem:
1. Find the fall time from the height: t = √(2h / g).
2. The distance between the target and the plane's projection onto the ground at the moment of release is d = v × t = v √(2h / g).
3. The pilot must see the target at an angle θ = arctan(h / d) = arctan((1/v) √(g h / 2)) below the horizontal.
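A minimal Python sketch of the same calculation; the height and speed values are assumed purely for illustration, and the function name drop_solution is just a placeholder:

```python
import math

def drop_solution(h, v, g=9.8):
    """Return (fall time, drop distance, sight angle in degrees) for an object
    released horizontally from height h at speed v, ignoring air resistance."""
    t = math.sqrt(2 * h / g)                 # fall time from h = (1/2) g t^2
    d = v * t                                # horizontal distance covered during the fall
    theta = math.degrees(math.atan2(h, d))   # angle of the target below the horizontal
    return t, d, theta

# Assumed example values (not part of the problem statement): h = 500 m, v = 100 m/s
t, d, theta = drop_solution(h=500.0, v=100.0)
print(f"fall time   t ≈ {t:.1f} s")
print(f"distance    d ≈ {d:.0f} m")
print(f"sight angle θ ≈ {theta:.1f}° below the horizontal")
```

The same numbers as in the check above come out: t ≈ 10.1 s, d ≈ 1010 m, θ ≈ 26°.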