A major-league pitcher can throw a baseball in excess of 48.3 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher who is 15.3 m away from the point of release?

To determine how much the ball drops by the time it reaches the catcher, we need to find the vertical distance it falls during the time it takes to cover the horizontal distance.

To do this, we can use the kinematic equation for vertical displacement, assuming constant acceleration due to gravity:

Δy = Vyi * t + (1/2) * a * t^2

Since the ball is thrown horizontally, the initial vertical velocity is zero (Vyi = 0).

The only vertical acceleration acting on the ball is gravity (a = -9.8 m/s^2). Gravity acts purely vertically, so it does not affect the horizontal motion, and the horizontal velocity stays at 48.3 m/s for the whole flight (air resistance is neglected).

Now we need the time it takes for the ball to reach the catcher. Since the horizontal distance (15.3 m) and the constant horizontal velocity (48.3 m/s) are known, we can use the formula:

Δx = Vxi * t

Rearranging the equation, we get:

t = Δx / Vxi

Substituting the values, we have:

t = 15.3 m / 48.3 m/s

Calculating the time, we find:

t ≈ 0.317 s
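
As a quick numerical check, this step can be reproduced with a few lines of Python. This is just a minimal sketch of the arithmetic above; the variable names are chosen here for illustration.

```python
# Time for the ball to cover the horizontal distance to the catcher.
# The horizontal velocity is constant because gravity acts only vertically
# (air resistance is neglected).
dx = 15.3   # horizontal distance to the catcher, m
vx = 48.3   # horizontal release speed, m/s

t = dx / vx
print(f"time of flight: {t:.3f} s")   # ~0.317 s
```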

Now, we can find the vertical displacement:

Δy = 0 * 0.317 s + (1/2) * (-9.8 m/s^2) * (0.317 s)^2

Simplifying the equation, we get:

Δy ≈ -0.49 m

The negative sign indicates that the displacement is downward, i.e., the ball falls below the height at which it was released.

Therefore, the ball will drop approximately 0.49 m (about half a meter) by the time it reaches the catcher.
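
Putting both steps together, here is a minimal Python sketch of the full calculation, assuming g = 9.8 m/s^2 and no air resistance:

```python
# Vertical drop of a horizontally thrown ball over a known horizontal distance.
dx = 15.3    # horizontal distance to the catcher, m
vx = 48.3    # horizontal release speed, m/s
g  = 9.8     # magnitude of gravitational acceleration, m/s^2

t = dx / vx                 # time of flight, from the horizontal motion
dy = -0.5 * g * t**2        # vertical displacement (initial vertical velocity is zero)

print(f"t  = {t:.3f} s")    # ~0.317 s
print(f"dy = {dy:.3f} m")   # ~ -0.49 m: the ball drops about half a meter
```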