One of the fastest recorded pitches in major-league baseball, thrown by Tim Lincecum in 2009, was clocked at 101.0 mi/h (see the figure). If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?

≈ 2.7 ft (about 0.82 m)

It will fall about 0.82 m (≈ 2.7 ft) in the ≈ 0.41 s it takes to reach the plate.

To determine how far the ball would fall vertically by the time it reaches home plate, we need to calculate the time it takes for the ball to travel the horizontal distance and then use that time to calculate the vertical distance fallen.

Step 1: Convert the velocity from mph to ft/s.
To convert from miles per hour (mi/h) to feet per second (ft/s), we need to multiply by a conversion factor.
1 mile = 5280 feet
1 hour = 3600 seconds

So, 101.0 mi/h * (5280 ft / 1 mi) * (1 hr / 3600 s) ≈ 148.13 ft/s.
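As a quick numerical check, the conversion can be scripted; this is a minimal Python sketch, with the variable names chosen only for illustration:

```python
# Minimal sketch of Step 1: mi/h -> ft/s (1 mi = 5280 ft, 1 h = 3600 s).
MPH_TO_FTPS = 5280.0 / 3600.0        # ft/s per mi/h, about 1.4667

v_mph = 101.0
v_ftps = v_mph * MPH_TO_FTPS
print(f"{v_ftps:.2f} ft/s")          # prints 148.13 ft/s
```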

Step 2: Calculate the time it takes for the ball to reach home plate.
We can use the equation distance = speed * time to find the time.
Since the distance is given as 60.5 ft and the speed is 148.13 ft/s, we can rearrange the equation to solve for time:
Time = Distance / Speed

Time = 60.5 ft / 148.13 ft/s ≈ 0.4084 s (rounded to 4 decimal places).
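The same one-line division is easy to verify in code; again, a minimal sketch with illustrative names:

```python
# Minimal sketch of Step 2: time = horizontal distance / horizontal speed.
v_ftps = 101.0 * 5280.0 / 3600.0     # about 148.13 ft/s, from Step 1
d_plate_ft = 60.5                    # horizontal distance to home plate

t = d_plate_ft / v_ftps
print(f"{t:.4f} s")                  # prints 0.4084 s
```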

Step 3: Calculate the vertical distance fallen.
Since the ball starts with no vertical velocity, its vertical motion is free fall from rest, so we can use the equation for distance fallen under constant acceleration:
Distance = 1/2 * acceleration * time^2

The acceleration due to gravity is approximately 32.2 ft/s^2.

Distance = 1/2 * (32.2 ft/s^2) * (0.4084 s)^2 ≈ 2.69 ft (rounded to 2 decimal places).
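A short Python sketch reproduces this last step (assuming g ≈ 32.2 ft/s^2, as above; the variable names are only illustrative):

```python
# Minimal sketch of Step 3: drop = (1/2) * g * t^2 with zero initial vertical speed.
g_ftps2 = 32.2                               # magnitude of g in ft/s^2, as used above
t = 60.5 / (101.0 * 5280.0 / 3600.0)         # time of flight from Step 2, ~0.4084 s

drop_ft = 0.5 * g_ftps2 * t**2
print(f"{drop_ft:.2f} ft")                   # prints 2.69 ft
```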

Therefore, the ball would fall vertically approximately 2.69 feet (about 0.82 m) by the time it reaches home plate.

To determine how far the ball would fall vertically by the time it reached home plate, we can use the equations of motion and the principles of projectile motion.

First, let's convert the given velocity from miles per hour (mi/h) to feet per second (ft/s). We know that 1 mile equals 5,280 feet, and 1 hour equals 3,600 seconds:

101.0 mi/h = (101.0 * 5280 ft) / (3600 s) ≈ 148.13 ft/s

Now, we can analyze the vertical motion of the pitched ball. Since the ball is thrown horizontally, the initial vertical velocity (Vyi) is 0 ft/s, and the only force acting on the ball in the vertical direction is gravity.

Using the equations of motion, we can calculate the vertical displacement (Δy) of the ball. The equation for vertical displacement is given by:

Δy = Vyi * t + (0.5) * a * t^2

Where Vyi is the initial vertical velocity, t is the time of flight, and a is the acceleration due to gravity (-32.17 ft/s^2).

Since the initial vertical velocity is 0 ft/s, the equation simplifies to:

Δy = (0.5) * (-32.17 ft/s^2) * t^2

Now, we need to find the time it takes for the ball to travel 60.5 ft horizontally. The horizontal velocity (Vx) is equal to 148.13 ft/s, and the horizontal distance (Δx) is 60.5 ft. The equation for horizontal displacement is given by:

Δx = Vx * t

Rearranging the equation to solve for time (t):

t = Δx / Vx

Substituting the values:

t = 60.5 ft / 148.13 ft/s ≈ 0.4084 s

Now, we can substitute this value of time into the equation for vertical displacement to find how much the ball would fall vertically:

Δy = (0.5) * (-32.17 ft/s^2) * (0.4084 s)^2

Simplifying the equation:

Δy ≈ -2.68 ft

The negative sign indicates that the ball falls in the downward direction. So, the ball would fall approximately 2.68 ft (about 0.82 m) vertically by the time it reached home plate, which is 60.5 ft away.
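For completeness, here is a minimal Python sketch of this second solution that keeps its sign convention (up positive, so a = -32.17 ft/s^2 and the displacement comes out negative); the variable names are only illustrative:

```python
# Minimal sketch of the second solution, keeping its sign convention
# (up is positive, so a = -32.17 ft/s^2 and the vertical displacement is negative).
a = -32.17                            # ft/s^2
v_x = 101.0 * 5280.0 / 3600.0         # horizontal speed, ~148.13 ft/s
t = 60.5 / v_x                        # time to reach the plate, ~0.4084 s

dy = 0.5 * a * t**2                   # Vyi = 0, so Δy = (1/2) * a * t^2
print(f"{dy:.2f} ft")                 # prints -2.68 ft, i.e. a drop of about 2.68 ft
```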