One of the fastest recorded pitches in major-league baseball, thrown by Nolan Ryan in 1974, was clocked at 96.8 mi/h.

The acceleration of gravity is 32 ft/s^2.
If a pitch were thrown horizontally with a velocity of 96.8 mi/h, how far would the ball fall vertically by the time it reached home plate, 60 ft away? Answer in units of ft.

v0 = 96.8 mi/h * 5280 ft/mi * 1 h/3600 s ≈ 142 ft/s.

dx = v0 * tf = 60 ft.
142 ft/s * tf = 60 ft.
tf = 0.423 s, which is also the fall time.

d = 0.5 * g * tf^2 = 16 * 0.423^2 ≈ 2.86 ft, vertically.
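
As a quick sanity check, here is a minimal Python sketch of the same three steps (the variable and constant names are mine, not from the problem):

    # Check the arithmetic above: convert speed, find flight time, find drop.
    MI_TO_FT = 5280.0   # feet per mile
    H_TO_S = 3600.0     # seconds per hour
    G = 32.0            # ft/s^2, as given in the problem

    v0 = 96.8 * MI_TO_FT / H_TO_S   # horizontal speed: ~142.0 ft/s
    tf = 60.0 / v0                  # time to cover 60 ft: ~0.423 s
    d = 0.5 * G * tf**2             # vertical drop in that time: ~2.86 ft

    print(round(v0, 1), round(tf, 3), round(d, 2))

Running it prints 142.0 0.423 2.86, matching the hand calculation.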

To find the vertical distance the ball falls, we can use the equation of motion:

D = (1/2) * g * t^2,

where D is the distance fallen, g is the acceleration due to gravity, and t is the time it takes for the ball to travel 60 ft horizontally.

First, let's convert the velocity from mph to ft/s:

v = 96.8 mi/h * 5280 ft/mi * 1 h/3600 s ≈ 142.0 ft/s.

Since the pitch is thrown horizontally, the initial vertical velocity (vy) is zero.

Next, we can find the time it takes for the ball to travel 60 ft horizontally:

t = distance / horizontal velocity = 60 ft / 142.0 ft/s ≈ 0.423 s.

Now we can calculate the vertical distance fallen:

D = (1/2) * g * t^2 = (1/2) * (32 ft/s^2) * (0.423 s)^2 ≈ 2.86 ft.

Therefore, the ball would fall approximately 2.86 ft vertically by the time it reaches home plate, 60 ft away.
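
As a cross-check, substituting t = x / v into D = (1/2) * g * t^2 gives the drop in a single step, without computing the time explicitly:

D = g * x^2 / (2 * v^2) = 32 * 60^2 / (2 * 142.0^2) ≈ 2.86 ft,

in agreement with both calculations above.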