A major-league pitcher can throw a baseball in excess of 40.2 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher who is 17.7 m away from the point of release?

Time to reach catcher = T = 17.7 m / 40.2 m/s = 0.440 s

Amount of drop = (1/2)*g*T^2 = (1/2)(9.8)(0.440)^2 ≈ 0.95 m

The ball can drop even more if a "sinker" is thrown. This involves aerodynamic forces that are not considered here.

To determine how much the ball will drop by the time it reaches the catcher, we need to understand the relationship between the horizontal and vertical components of the ball's motion.

When a ball is thrown horizontally, its initial velocity is entirely in the horizontal direction. This means the vertical component of its velocity is initially zero. However, due to the acceleration due to gravity, the ball will also start to fall vertically as it moves horizontally.

First, let's analyze the horizontal motion. We know that the ball is thrown horizontally at a speed of 40.2 m/s, and it travels a distance of 17.7 m. Since there is no horizontal acceleration (assuming no air resistance), the time taken to cover this distance can be calculated using the formula:

time = distance / horizontal velocity

time = 17.7 m / 40.2 m/s

Calculating this gives us: time = 0.440 s (rounded to three decimal places).

Since the horizontal velocity is constant, the ball reaches the catcher after 0.440 seconds.
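
If you want to check this step numerically, here is a minimal Python sketch; the variable names (speed, distance, flight_time) are chosen here just for illustration, with the values taken from the problem statement.

# Horizontal motion: constant velocity, so time = distance / speed.
speed = 40.2      # horizontal release speed, m/s
distance = 17.7   # horizontal distance to the catcher, m

flight_time = distance / speed
print(f"Time to reach catcher: {flight_time:.3f} s")   # prints ~0.440 s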

Now let's focus on the vertical motion. The ball starts with an initial vertical velocity of zero, and the only acceleration acting on it is due to gravity, which is approximately 9.8 m/s².

Using the equation of motion for vertical motion:

distance = (1/2) * acceleration * time²

Substituting the values we have:

distance = (1/2) * 9.8 m/s² * (0.440 s)²

distance ≈ 0.95 m

Therefore, the ball will drop by approximately 0.95 meters by the time it reaches the catcher, in agreement with the quick calculation above.
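
The vertical step can be checked the same way. This short Python sketch reuses the flight time found from the horizontal motion; again, the names and the rounding in the print statement are just for illustration.

# Vertical motion: initial vertical velocity is zero, so drop = (1/2) * g * t^2.
g = 9.8                      # gravitational acceleration, m/s^2
flight_time = 17.7 / 40.2    # flight time from the horizontal motion, s

drop = 0.5 * g * flight_time ** 2
print(f"Drop over the flight: {drop:.2f} m")   # prints ~0.95 m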