One of the fastest recorded pitches in major-league baseball, thrown by Nolan Ryan in 1974, was clocked at 100.8 mi/hr. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.0 ft away?

To determine how far the ball would fall vertically by the time it reached home plate, we can treat the pitch as a projectile: the horizontal velocity stays constant while gravity accelerates the ball downward, so the flight time comes from the horizontal motion and the drop comes from free fall.

Here's how you can do it:

1. Convert the velocity from miles per hour (mi/hr) to feet per second (ft/s). Since 1 mile equals 5280 feet and 1 hour equals 3600 seconds, multiply 100.8 by 5280 and divide by 3600:

Velocity = (100.8 mi/hr) * (5280 ft/mi) / (3600 s/hr) = 147.84 ft/s

2. Calculate the time it takes for the ball to reach home plate. Since we know the horizontal distance (60.0 ft) and the horizontal velocity (which stays constant because gravity acts only vertically), we can use the formula:

Time = Distance / Velocity

Substituting the values, Time = 60.0 ft / (147.84 ft/s) ≈ 0.406 s

This is the time the ball takes to travel from the pitcher's hand to home plate.

3. Now, because the ball starts with zero vertical velocity, its drop during this time is given by the free-fall equation:

Fall = (1/2) * g * Time^2

Where "g" is the acceleration due to gravity, which is approximately 32.2 ft/s^2.

Substituting Time ≈ 0.406 s gives Fall = (1/2)(32.2 ft/s^2)(0.406 s)^2 ≈ 2.65 ft.

4. The calculated fall, about 2.65 ft (roughly 0.81 m), is the vertical distance the ball drops during its horizontal flight, i.e., by the time it reaches home plate.

5. Round the answer to match the precision of the given data; since the speed and distance are given to three or four significant figures, reporting the drop as 2.65 ft is appropriate.

By following these steps, you can determine how far the ball would fall vertically by the time it reaches home plate, about 2.65 ft; a short numerical sketch of the same calculation follows.
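
Here is a minimal Python sketch of the steps above. The variable names are illustrative rather than standard, and it assumes ideal projectile motion with no air resistance:

```python
# Drop of a horizontally thrown pitch over 60.0 ft,
# assuming pure projectile motion (no air resistance).

MPH_TO_FPS = 5280 / 3600   # 1 mi/hr = 5280 ft per 3600 s
G = 32.2                   # acceleration due to gravity, ft/s^2

speed_mph = 100.8          # pitch speed, mi/hr
distance_ft = 60.0         # pitcher-to-plate distance, ft

# Step 1: convert the speed to ft/s.
speed_fps = speed_mph * MPH_TO_FPS

# Step 2: flight time from the (constant) horizontal motion.
flight_time = distance_ft / speed_fps

# Step 3: vertical drop during that time, starting from zero
# vertical velocity: fall = (1/2) * g * t^2.
fall_ft = 0.5 * G * flight_time ** 2

print(f"Speed: {speed_fps:.2f} ft/s")
print(f"Time:  {flight_time:.3f} s")
print(f"Fall:  {fall_ft:.2f} ft")
```

Run as a script, this prints a speed of 147.84 ft/s, a flight time of about 0.406 s, and a drop of about 2.65 ft, matching the hand calculation above.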