A major-league pitcher can throw a baseball in excess of 41.3 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches a catcher who is 15.2 m away from the point of release?

Quick outline:

t = time = distance / speed = 15.2 / 41.3

drop = (1/2) g t² ≈ 4.9 t²

To determine how much the ball will drop by the time it reaches the catcher, we need to consider the effect of gravity on the vertical motion of the ball.

First, let's find out how long it takes for the ball to reach the catcher. We can use the formula for horizontal distance traveled:

distance = speed × time

In this case, the distance is 15.2 m and the speed is 41.3 m/s. Solving for time, we get:

time = distance / speed

time = 15.2 m / 41.3 m/s

time ≈ 0.368 seconds
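As a quick check, this step can be reproduced in a few lines of Python (the variable names here are just for illustration):

```python
# Horizontal flight time: time = distance / speed
# (gravity does not affect the horizontal motion)
distance = 15.2   # m, from release point to catcher
speed = 41.3      # m/s, horizontal speed of the pitch

time = distance / speed
print(f"time of flight ≈ {time:.3f} s")   # ≈ 0.368 s
```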

Next, we calculate how far the ball drops during this time due to gravity. The vertical distance fallen from rest can be calculated using the formula:

distance = (1/2) × acceleration due to gravity × time²

In this case, the acceleration due to gravity is approximately 9.81 m/s², and the time is 0.368 seconds. Plugging in these values, we have:

distance = (1/2) × 9.81 m/s² × (0.368 s)²

distance ≈ 0.664 m

Therefore, by the time the ball reaches the catcher 15.2 m away, it will have dropped approximately 0.664 m.
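Putting both steps together, a minimal Python sketch (using g = 9.81 m/s², as above; names are illustrative) confirms the result:

```python
# Drop of a horizontally thrown ball: t = d / v, then drop = (1/2) * g * t^2
g = 9.81          # m/s^2, acceleration due to gravity
distance = 15.2   # m, horizontal distance to the catcher
speed = 41.3      # m/s, horizontal release speed

t = distance / speed      # flight time ≈ 0.368 s
drop = 0.5 * g * t**2     # vertical drop ≈ 0.664 m
print(f"drop ≈ {drop:.3f} m")
```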