posted by Anonymous.
A major-league pitcher can throw a baseball in excess of 40.5 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher who is 17.7 m away from the point of release?
To a certain extent, the pitcher can control how far the ball drops, or even make it briefly rise, by varying the spin he gives to the ball. The problem expects you to ignore aerodynamic effects, however.
The time it takes the ball to reach the catcher is
T = (17.7 m)/(40.5 m/s) = 0.437 s
Next, compute how far the ball will drop due to gravity during that time. Remember that the initial vertical velocity component is zero.
Use the formula
change in y = (g/2) T^2 = (9.8/2)(0.437)^2 ≈ 0.94 m
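The two steps above can be checked with a short script. This is just a sketch of the arithmetic, assuming g = 9.8 m/s^2 (the problem statement does not give a value for g):

```python
g = 9.8    # gravitational acceleration, m/s^2 (assumed value)
v = 40.5   # horizontal release speed, m/s
d = 17.7   # distance from release point to catcher, m

t = d / v               # time of flight, s
drop = (g / 2) * t**2   # vertical drop during that time, m

print(f"t = {t:.3f} s, drop = {drop:.2f} m")
# prints: t = 0.437 s, drop = 0.94 m
```

So the ball falls roughly 0.94 m (about 3 ft) on its way to the plate, which is why pitchers aim well above the catcher's mitt.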