A supply plane needs to drop a package of food to scientists working on a glacier in Greenland. The plane flies 200 m above the glacier at a speed of 200 m/s. How far short of the target should it drop the package?

To determine how far short the supply plane should drop the package, we need to consider the horizontal distance the plane covers during the time it takes for the package to reach the scientists on the ground.

First, let's calculate the time it takes for the package to reach the ground. The plane is flying at 200 m/s, so the package leaves the plane with that same horizontal velocity. The package is released from an altitude of 200 m. We can use the equation of motion for vertical motion to calculate the time it takes to fall to the ground.

The equation for vertical motion is given by:

s = ut + (1/2)at^2

Where:
- s is the displacement (change in vertical position),
- u is the initial vertical velocity,
- t is time, and
- a is the acceleration due to gravity.

In this case, the initial vertical velocity (u) is 0, since the package is simply released from the plane: it has horizontal velocity, but no vertical velocity at the moment of release. The acceleration due to gravity (a) is approximately 9.8 m/s^2.

Substituting the known values into the equation:

200 = 0*t + (1/2)*9.8*t^2

Simplifying the equation:

4.9t^2 = 200

Dividing both sides of the equation by 4.9:

t^2 = 200/4.9

t^2 ≈ 40.82

Taking the square root of both sides to solve for t:

t ≈ √40.82

t ≈ 6.39 seconds

Therefore, it takes approximately 6.39 seconds for the package to reach the ground.
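As a quick check on the arithmetic, here is a minimal Python sketch of this step, assuming SI units (metres, m/s^2) as used above; the variable names are just illustrative:

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2
h = 200.0  # release altitude above the glacier, m

# From s = u*t + (1/2)*a*t^2 with u = 0, solving for t gives t = sqrt(2h/g)
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.2f} s")  # prints: fall time: 6.39 s
```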

Now, let's calculate the horizontal distance covered by the plane (and the package) during this time by multiplying the horizontal velocity of 200 m/s by the time of flight.

Distance = Velocity * Time
Distance = 200 m/s * 6.39 s
Distance ≈ 1278 meters

So, the supply plane should drop the package approximately 1278 meters (about 1.3 km) short of the target to account for the horizontal distance the package travels during its fall.
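Note that the two steps collapse into a single closed-form expression, d = v·√(2h/g), which is handy for checking variants of this problem. A small Python sketch, again assuming SI units:

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2
h = 200.0  # release altitude, m
v = 200.0  # horizontal speed of the plane, m/s

# Combined formula: horizontal distance = speed * fall time = v * sqrt(2h/g)
d = v * math.sqrt(2 * h / g)
print(f"drop the package {d:.0f} m short of the target")  # ~1278 m
```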