One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?

I tried to convert from miles to feet, but I got huge numbers that I couldn't work with. I also tried to find the time of flight so that I could find the vertical distance, but to no avail. I have no idea how to start this problem, even though it seems really simple.

I get 2.64 feet.

You are right, Elisa. I get 2.66 ft.

101 mph = (101/60)(88) = 148.13 ft/s.

60.5/148.13 = 0.408 sec.

h = 16t^2 = 16(0.408^2) = 2.66 feet.
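
As a quick sanity check of those three lines, here is a minimal Python sketch; the 16 ft/s^2 is just g/2 with g ≈ 32 ft/s^2, and the variable names are mine:

```python
# Check of the back-of-the-envelope calculation above,
# using g ~= 32 ft/s^2 so that (1/2) g = 16 ft/s^2.
speed_fps = (101 / 60) * 88   # 101 mph in ft/s, via 60 mph = 88 ft/s
t = 60.5 / speed_fps          # time to cover 60.5 ft horizontally
drop = 16 * t**2              # vertical fall, h = (1/2) g t^2

print(f"speed = {speed_fps:.2f} ft/s, t = {t:.3f} s, drop = {drop:.2f} ft")
# speed = 148.13 ft/s, t = 0.408 s, drop = 2.67 ft (2.66 ft if t is rounded to 0.408 s first)
```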

Well, just like a clown attempting to throw a pie horizontally but accidentally dropping it, let's see if we can calculate how far the ball would fall vertically!

First, let's convert the velocity from miles per hour to feet per second, because that's how we roll in the world of physics. There are 5,280 feet in a mile and 3,600 seconds in an hour, so multiplying 101.0 mi/h by 5280 ft/mi and dividing by 3600 s/h, we get approximately 148.13 ft/s.

Now, we can determine the time it takes for the ball to travel the 60.5 ft distance. Using the equation distance = rate × time, we can rearrange it to time = distance / rate. Substituting the values, we find that the time taken is roughly 0.41 seconds.

Since we know the horizontal distance and the time taken, we can now calculate the vertical distance the ball falls using the equation distance = 0.5 × acceleration × time squared.

Now, here's the only ingredient left: the downward acceleration. Assuming gravity hasn't taken any coffee breaks during this baseball situation, it is simply Earth's gravitational acceleration, about 32.2 ft/s².

Plugging in the known values, we find that the vertical distance fallen is approximately 0.5 × 32.2 ft/s² × (0.41 s)² ≈ 2.7 feet.

So, by the time the ball reaches home plate, it would fall vertically by around 2.7 feet. However, keep in mind that this calculation assumes no air resistance, no spin, and a perfectly horizontal trajectory, just like a clown's attempt to juggle without dropping anything.
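
A short Python sketch of that calculation, using the unrounded flight time and assuming g = 32.2 ft/s² and no air resistance, as stated above:

```python
G = 32.2                  # ft/s^2, downward acceleration due to gravity
v = 101.0 * 5280 / 3600   # horizontal speed in ft/s
t = 60.5 / v              # flight time over 60.5 ft
drop = 0.5 * G * t**2     # vertical fall during that time

print(f"t = {t:.3f} s, drop = {drop:.2f} ft")   # t = 0.408 s, drop = 2.69 ft
```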

To solve this problem, you can use the equations of motion from physics. One of these equations relates distance to initial velocity, time, and acceleration.

The equation you will use is:

distance = initial velocity * time + (1/2) * acceleration * time^2
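
For a horizontal pitch, this general equation splits into two independent pieces, which is the key simplification here (a sketch, taking downward as positive and assuming the ball leaves the hand with zero vertical velocity):

```latex
% Horizontal: constant speed, no acceleration term.
x = v_{0x}\, t
% Vertical: starts from rest (v_{0y} = 0), only gravity acts.
y = \tfrac{1}{2}\, g\, t^{2}
```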

In this case, the initial velocity is the horizontal velocity of the pitch, which is given as 101.0 mi/h. However, we need to convert this velocity to feet per second, since the distances are given in feet.

To do this, we can use the conversion factors 1 mile = 5280 feet and 1 hour = 3600 seconds:

101.0 mi/h * (5280 ft / 1 mi) * (1 h / 3600 s) ≈ 148.13 ft/s

So, the horizontal velocity is approximately 148.13 ft/s.

Now, let's find the time it takes for the pitch to travel the 60.5 ft to home plate.

Using the equation:

distance = initial velocity * time

we can rearrange it to solve for time:

time = distance / initial velocity

Substituting the known values:

time = 60.5 ft / 148.13 ft/s ≈ 0.408 s

Now let's calculate the vertical distance the ball falls during this time.

Assuming a constant vertical acceleration of -32.17 ft/s^2 (acceleration due to gravity), we can use the equation:

distance = (1/2) * acceleration * time^2

Substituting the known values:

distance = (1/2) * (-32.17 ft/s^2) * (0.408 s)^2 ≈ -2.68 ft

Since the displacement is directed downward, the ball falls approximately 2.68 ft vertically by the time it reaches home plate.

Note: The negative sign indicates downward motion because we have taken upward as positive.
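
Putting the whole procedure in one place, here is a minimal Python sketch that reproduces these numbers; the function name and default arguments are mine, not part of the problem:

```python
def vertical_drop_ft(speed_mph: float, plate_distance_ft: float = 60.5,
                     g: float = 32.17) -> float:
    """Drop of a ball thrown perfectly horizontally, ignoring air resistance
    and spin: convert mi/h to ft/s, find the flight time, then apply
    d = (1/2) g t^2."""
    speed_fps = speed_mph * 5280 / 3600   # mi/h -> ft/s
    t = plate_distance_ft / speed_fps     # time to reach home plate
    return 0.5 * g * t**2                 # magnitude of the downward displacement

print(f"{vertical_drop_ft(101.0):.2f} ft")  # about 2.68 ft
```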

Long ago I memorized a conversion factor: 60 mph = 88 ft/sec, so 101 mph = (101/60)(88) ≈ 148 ft/sec.

time to home plate = 60.5 ft / velocity ≈ 0.41 sec

Now put that time into
distance = (1/2) g t^2, where g is 32 ft/s^2.

I get about 2.7 feet.
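
A quick check of this last shortcut, assuming g = 32 ft/s² as written above:

```python
# Check of the 60 mph = 88 ft/s shortcut for this pitch.
v = (101 / 60) * 88   # ft/s
t = 60.5 / v          # s, time to reach home plate
d = 0.5 * 32 * t**2   # ft, vertical drop with g = 32 ft/s^2

print(f"v = {v:.0f} ft/s, t = {t:.2f} s, d = {d:.1f} ft")
# v = 148 ft/s, t = 0.41 s, d = 2.7 ft
```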