The fastest major league pitcher throws a ball at 41.0 m/s.

If he throws the ball horizontally, how far does it drop vertically on the 19.7 m trip to home plate?

t = 19.7/41.0

h = 0 + 0 - 4.9t^2 (from y = y0 + v0y*t - (1/2)g*t^2, with y0 = 0 and v0y = 0 for a horizontal throw, taking up as positive; the drop is 4.9t^2)

To determine how far the ball drops vertically during the 19.7 m trip to home plate, we can use the kinematic equations for projectile motion. The formula we will use is:

h = (1/2) * g * t^2

Where:
h is the vertical displacement (how far the ball drops)
g is the acceleration due to gravity (approximately 9.8 m/s^2)
t is the time taken for the ball to reach home plate

First, we need to find the time taken for the ball to cover a horizontal distance of 19.7 m. Since the ball is thrown horizontally, the vertical motion is independent of the horizontal motion, and the horizontal velocity stays constant (ignoring air resistance). Therefore, we can use the horizontal distance and the initial velocity to find the time.

The formula to calculate time is:

t = d / v

Where:
d is the horizontal distance (19.7 m)
v is the horizontal velocity (41.0 m/s)

So, substituting the given values:

t = 19.7 / 41.0
t ≈ 0.48 s

Now that we have the time, we can use it to calculate the vertical displacement:

h = (1/2) * g * t^2
h = (1/2) * 9.8 * (0.48^2)
h ≈ 1.13 m

Therefore, the ball would drop approximately 1.13 meters vertically during the 19.7 meter trip to home plate.
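
As a quick numerical check, here is a minimal Python sketch of the same two-step calculation (the variable names are my own, chosen for this example):

```python
# Horizontal throw: how far does the ball drop on the way to home plate?
# Assumes no air resistance, so the horizontal velocity is constant and
# the vertical motion is free fall starting from zero vertical velocity.

g = 9.8    # acceleration due to gravity, m/s^2
v = 41.0   # horizontal velocity, m/s
d = 19.7   # horizontal distance to home plate, m

t = d / v             # time of flight: t = d / v
h = 0.5 * g * t**2    # vertical drop: h = (1/2) * g * t^2

print(f"t = {t:.3f} s")  # prints t = 0.480 s
print(f"h = {h:.2f} m")  # prints h = 1.13 m
```

Running it reproduces t ≈ 0.480 s and h ≈ 1.13 m, matching the hand calculation above.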