A major-league pitcher can throw a ball in excess of 37.8 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches a catcher who is 17.0 m away from the point of release?

To find out how much the ball will drop by the time it reaches the catcher, you need to calculate the vertical displacement of the ball.

First, let's break down the motion of the ball into its horizontal and vertical components. Since the ball is thrown horizontally, there is no initial vertical velocity. The only force acting on the ball vertically is the force of gravity.

The formula for vertical displacement under constant acceleration is given by the equation:
d = v₀t + 0.5at²

In this case, the initial vertical velocity, v₀, is 0 because the ball is thrown horizontally. The acceleration, a, is the acceleration due to gravity, which is approximately 9.8 m/s². The time, t, can be found using the horizontal component of the motion.
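For reference, the displacement formula can be written as a small Python function (a minimal sketch; the function name is just for illustration):

def displacement(v0, a, t):
    # d = v0*t + 0.5*a*t**2, valid for constant acceleration a
    return v0 * t + 0.5 * a * t**2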

Since the ball is thrown horizontally, its horizontal velocity, v, remains constant throughout the motion (air resistance is ignored). The formula for horizontal displacement is given by:
x = vt

In this case, the horizontal velocity, v, is 37.8 m/s and the horizontal distance, x, is 17.0 m. Rearranging the formula, we can solve for time:
t = x / v
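As a quick numerical check, here is a minimal Python sketch of this step (the variable names are just for illustration):

x = 17.0   # horizontal distance to the catcher, m
v = 37.8   # horizontal speed of the pitch, m/s
t = x / v  # flight time, s
print(t)   # ≈ 0.4497 s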

Now we can substitute the value of time, t, into the formula for vertical displacement to find the answer. Since v₀ = 0, the vertical displacement, d, reduces to:
d = 0.5at²

Substituting t = x / v, we have:
d = 0.5 * 9.8 * (x / v)²

Simplifying further:
d = 0.5 * 9.8 * (17.0 / 37.8)²

Evaluating the expression:
d = 0.5 * 9.8 * 0.4497²

d ≈ 0.5 * 9.8 * 0.2023

d ≈ 0.99 m

Therefore, the ball will drop approximately 0.99 meters (just under one meter) by the time it reaches the catcher, who is 17.0 meters away from the point of release.
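Putting the whole calculation together, here is a minimal Python sketch (assuming g = 9.8 m/s² and neglecting air resistance):

g = 9.8                # acceleration due to gravity, m/s²
v = 37.8               # horizontal speed of the pitch, m/s
x = 17.0               # horizontal distance to the catcher, m
t = x / v              # flight time from the horizontal motion, s
drop = 0.5 * g * t**2  # vertical drop with zero initial vertical velocity, m
print(round(drop, 2))  # prints 0.99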

As a quick recap:

V*t = 17 m.

t = 17/V = 17/37.8 ≈ 0.4497 s.

d = 0.5g*t² = 0.5 * 9.8 * 0.4497² ≈ 0.99 m.

g = 9.8 m/s².