One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?

___ft

≈ 2.68 ft

To determine how far the ball would fall vertically by the time it reaches home plate, we can use the equations of motion.

Because the pitch is thrown horizontally, the ball starts with zero vertical velocity. What we need is the vertical distance it falls during the time it takes to cover the 60.5 ft to home plate.

First, let's convert the velocity from miles per hour to feet per second. Since 1 mile is equal to 5280 feet and 1 hour is equal to 3600 seconds, we can convert 101.0 mi/h to ft/s:

101.0 mi/h * 5280 ft/1 mi * 1 h/3600 s = 148.133 ft/s (rounded to three decimal places)
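
If you want to double-check the conversion, a minimal Python sketch (variable names are my own) gives the same value:

speed_mph = 101.0
speed_fps = speed_mph * 5280 / 3600   # 1 mi = 5280 ft, 1 h = 3600 s
print(speed_fps)                      # ≈ 148.13 ft/s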

Now, we can use the kinematic equation for vertical displacement in the absence of air resistance:

d = ut + (1/2)gt^2

Where:
d = vertical displacement (unknown)
u = initial velocity in the vertical direction (0 ft/s since the pitch is thrown horizontally)
t = time of flight (unknown)
g = acceleration due to gravity, approximately 32.174 ft/s^2
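
Because u = 0 for a horizontally thrown pitch, the equation reduces to d = (1/2)gt^2. As a quick sanity check, here is that special case as a small Python helper (the function name is my own):

def fall_distance(t, g=32.174):
    # Vertical drop (ft) from rest after t seconds, ignoring air resistance
    return 0.5 * g * t ** 2

print(fall_distance(1.0))   # ≈ 16.1 ft in the first second, the familiar free-fall figure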

Since air resistance is neglected, the horizontal speed stays constant at 148.133 ft/s, so t is simply the time the ball needs to cover the 60.5 ft to home plate:

t = 60.5 ft / 148.133 ft/s ≈ 0.408 s (rounded to three decimal places)
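
In the same Python sketch, this step is:

speed_fps = 101.0 * 5280 / 3600   # ≈ 148.133 ft/s, from the conversion above
t = 60.5 / speed_fps              # time to cover 60.5 ft at constant horizontal speed
print(t)                          # ≈ 0.408 s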

Now we can substitute the values into the equation:

d = 0 + (1/2) * 32.174 ft/s^2 * (0.408 s)^2

d ≈ 2.68 ft

Therefore, the ball would fall vertically by approximately 2.68 ft (about 32 inches) by the time it reaches home plate.
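
For completeness, here is the whole calculation as one short, self-contained Python check (assumptions as above: no air resistance, g = 32.174 ft/s^2):

g = 32.174                   # acceleration due to gravity, ft/s^2
v = 101.0 * 5280 / 3600      # horizontal speed, ft/s (≈ 148.133)
t = 60.5 / v                 # time to reach home plate, s (≈ 0.408)
d = 0.5 * g * t ** 2         # vertical drop, ft
print(round(d, 2), round(d * 12, 1))   # ≈ 2.68 ft, ≈ 32.2 in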