posted by Josh.
I would really like to do some work on this problem, but I have no idea how to start. Please help!
An airplane is dropping bales of hay to cattle stranded in a blizzard on the Great Plains. The pilot releases the bales at 120 m above the level ground when the plane is flying at 80.0 m/s 55.0 degrees above the horizontal.
How far in front of the cattle should the pilot release the hay so that the bales will land at the point where the cattle are stranded?
Assuming constant velocity (80.0 m/s) and constant altitude (120 m), how long does it take the bales to fall to the ground, and how far do they travel horizontally in that time?
We know that
s = (1/2) g t^2, with g = 9.8 m/s^2 and s = 120 m, so 120 m = (4.9 m/s^2) t^2. Now solve for t.
The plane is flying at 80.0 m/s. How far does the bale travel in the time t you just determined?
The plane's position makes an angle of 55 degrees as seen from the target, and the altitude is 120 m. What is the horizontal distance of the plane from the target? How does this distance compare with the one you just calculated?
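The steps in this reply can be sketched numerically (a minimal sketch under that reply's simplifying assumption of level flight at constant altitude; the variable names are my own):

```python
import math

g = 9.8    # gravitational acceleration, m/s^2
h = 120.0  # release altitude, m
v = 80.0   # plane speed, m/s

# Fall time from s = (1/2) g t^2 with s = 120 m
t = math.sqrt(2 * h / g)

# Horizontal distance covered during the fall at the plane's speed
x = v * t

# Sightline check: if the plane sits at 55 degrees elevation as seen
# from the target, its horizontal offset from the target is h / tan(55)
d = h / math.tan(math.radians(55.0))

print(f"fall time t = {t:.2f} s")                  # about 4.95 s
print(f"horizontal distance = {x:.0f} m")          # about 396 m
print(f"55-degree sightline offset = {d:.0f} m")   # about 84 m
```

The comparison the reply asks for: the bale drifts roughly 396 m horizontally while falling, far more than the ~84 m offset of the 55-degree sightline, so the pilot must release well before that sightline is reached.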
Ok, the plane is flying at an angle, so has an upward velocity component and a horizontal component.
The upward component gives the bale an initial upward velocity. How long does it take for the bale to hit the ground?
yf = yi + vy*t - (1/2)(9.8) t^2
where yf = ground level = 0, yi = 120 m, and
vy = (80 m/s) sin 55
Solve this quadratic equation for the time t.
Now, having time in the air, how far does the bale travel horizontally?
xf = xi + vh*t
where xi = 0, vh = (80 m/s) cos 55, and t is the time found above. xf is the answer to the question.
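Putting this reply's steps together numerically (a sketch following the equations above; variable names are my own, and the positive root of the quadratic is the physical one):

```python
import math

g = 9.8                      # gravitational acceleration, m/s^2
h = 120.0                    # release altitude yi, m
v = 80.0                     # plane speed, m/s
theta = math.radians(55.0)   # launch angle above horizontal

vy = v * math.sin(theta)     # upward velocity component, ~65.5 m/s
vh = v * math.cos(theta)     # horizontal velocity component, ~45.9 m/s

# 0 = yi + vy*t - (1/2) g t^2  rearranged as  (g/2) t^2 - vy t - yi = 0;
# keep the positive root of the quadratic formula.
a, b, c = g / 2, -vy, -h
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

xf = vh * t                  # horizontal distance at impact

print(f"time in air t = {t:.2f} s")          # about 15.0 s
print(f"release distance xf = {xf:.0f} m")   # about 689 m
```

Note how much the upward launch matters: the bale stays aloft about 15 s instead of the ~5 s a level-flight drop would give, roughly tripling the horizontal distance.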