One of the fastest recorded pitches in major league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away? Report the answer in units of ft.

h = (1/2) g t^2, but time is d/v

h = (1/2)(32)(60.5/v)^2

v = 101 mi/h * (88 ft/s)/(60 mi/h) ≈ 148 ft/s

I get about two and a half feet in my head.
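
A quick numerical check of that back-of-envelope estimate (this is just a sanity-check sketch using the rounded values g ≈ 32 ft/s² and 60 mi/h = 88 ft/s; the variable names are only for illustration):

```python
# Back-of-envelope check: h = (1/2) * g * (d / v)^2
g = 32.0                    # ft/s^2, rounded gravitational acceleration
v = 101.0 * 88.0 / 60.0     # 101 mi/h in ft/s (60 mi/h = 88 ft/s), ~148.1
d = 60.5                    # ft, horizontal distance to home plate
h = 0.5 * g * (d / v) ** 2  # vertical drop during the flight time d/v
print(f"v = {v:.1f} ft/s, drop = {h:.2f} ft")  # ~2.67 ft
```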

To find out how far the ball would fall vertically by the time it reaches home plate, we can use the equations of motion. We'll assume that the only force acting on the ball is gravity, and neglect air resistance.

First, let's convert the velocity from miles per hour to feet per second, since the ball's fall will be measured in feet. We know that 1 mile is equal to 5280 feet, and 1 hour is equal to 3600 seconds.

So, the velocity of the pitch in feet per second is:
(101.0 mi/h) * (5280 ft/mi) / (3600 s/h) = 148.13 ft/s (rounded to two decimal places)
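
A minimal sketch of this unit conversion (the constant and variable names are illustrative, not from the original problem):

```python
MI_TO_FT = 5280.0   # feet per mile
HR_TO_S = 3600.0    # seconds per hour

v_mph = 101.0
v_fps = v_mph * MI_TO_FT / HR_TO_S  # convert mi/h to ft/s
print(f"{v_fps:.2f} ft/s")          # 148.13 ft/s
```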

Next, we can use the formula for the vertical distance fallen (h) in free-fall motion:

h = (1/2) * g * t^2

Where:
g is the acceleration due to gravity (32.2 ft/s^2),
t is the time it takes for the pitch to reach home plate.

To find t, we need to calculate the time taken to cover a distance of 60.5 ft. We can use the equation of motion:

d = v * t

Where:
d is the horizontal distance (60.5 ft),
v is the velocity (148.13 ft/s),
t is the time taken.

Rearranging the equation, we can solve for t:

t = d / v

Substituting the values, we get:

t = 60.5 ft / 148.13 ft/s = 0.4084 s (rounded to four decimal places)
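
As a quick check of this step (a sketch with illustrative names, using the converted speed from above):

```python
d = 60.5      # ft, horizontal distance to home plate
v = 148.13    # ft/s, pitch speed from the conversion above
t = d / v     # flight time at constant horizontal speed
print(f"t = {t:.4f} s")  # ~0.4084 s
```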

Now, we can substitute the time (t) into the formula for the vertical distance (h):

h = (1/2) * g * t^2

h = (1/2) * (32.2 ft/s^2) * (0.4084 s)^2 = 2.69 ft (rounded to two decimal places)

Therefore, the ball would fall vertically approximately 2.69 feet by the time it reaches home plate, 60.5 feet away.
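
Putting the steps together, here is a minimal end-to-end check (a sketch assuming g = 32.2 ft/s² and no air resistance; the function name and signature are just for illustration):

```python
def vertical_drop(speed_mph: float, distance_ft: float, g: float = 32.2) -> float:
    """Vertical drop (ft) of a horizontally thrown ball over distance_ft, ignoring air resistance."""
    v_fps = speed_mph * 5280.0 / 3600.0  # convert mi/h to ft/s
    t = distance_ft / v_fps              # flight time at constant horizontal speed
    return 0.5 * g * t ** 2              # free-fall drop during that time

print(f"{vertical_drop(101.0, 60.5):.2f} ft")  # ~2.69 ft
```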