A major-league pitcher can throw a ball in excess of 44.4 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches a catcher who is 17.0 m away from the point of release?

To solve this problem, we can use the principles of projectile motion. When a ball is thrown horizontally, it has no initial vertical velocity, only horizontal velocity. Ignoring air resistance, the only force acting on the ball after release is gravity, so the ball falls freely in the vertical direction while its horizontal velocity stays constant.

Take the point of release as the origin, so the initial vertical position, y(initial), is 0 and the initial vertical velocity is 0. The horizontal velocity, v(horizontal), is 44.4 m/s, and the horizontal distance to the catcher, x, is 17.0 m. The acceleration due to gravity, g, is approximately 9.8 m/s². We need to find the vertical distance the ball falls during its flight, y(drop).

The equation we can use to find y(drop) is:
y(drop) = 1/2 * g * t²,
where t is the time it takes for the ball to travel 17.0 m horizontally.

To find t, we can use the equation:
x = v(horizontal) * t.
Rearranging this equation to solve for t, we have:
t = x / v(horizontal).

Now we can substitute the given values:
t = 17.0 m / 44.4 m/s ≈ 0.383 seconds.

Finally, plugging this value of t into the equation for y(drop), we get:
y(drop) = 1/2 * 9.8 m/s² * (0.383 s)² ≈ 0.72 meters.

Therefore, the ball will drop approximately 0.72 meters (about 72 cm) by the time it reaches the catcher 17.0 meters away from the point of release.
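
As a quick check, here is a minimal Python sketch of the same two-step calculation (the variable names are ours, chosen for illustration, not part of the original problem):

```python
# Horizontal throw: constant horizontal velocity, free fall vertically.
g = 9.8              # acceleration due to gravity, m/s^2
v_horizontal = 44.4  # release speed, m/s
x = 17.0             # horizontal distance to the catcher, m

t = x / v_horizontal        # time of flight, s
y_drop = 0.5 * g * t**2     # vertical drop during that time, m

print(f"t = {t:.3f} s, drop = {y_drop:.3f} m")
# Expected output: t = 0.383 s, drop = 0.718 m  (≈ 0.72 m)
```

Carrying the unrounded time through the calculation gives about 0.718 m, which rounds to the same 0.72 m found above.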