One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?

How do I do this?

From speed × time = distance, solve for the time t (in seconds) to reach home plate.

The vertical drop, S, is given by the standard kinematics formula

S = ut + (1/2)gt²

where u = initial vertical velocity = 0 m/s and g = acceleration due to gravity = -9.8 m/s². S comes out negative because the ball ends up below its starting height.
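
If you want to check your arithmetic, here is a minimal Python sketch of that two-step approach (the function name vertical_drop and the example values in the comment are mine, not from the problem statement):

def vertical_drop(speed, distance, u=0.0, g=-9.8):
    # speed: horizontal speed (m/s); distance: horizontal distance to the plate (m)
    # u: initial vertical velocity (m/s); g: signed gravitational acceleration (m/s^2)
    t = distance / speed                  # time to reach home plate, in seconds
    return u * t + 0.5 * g * t**2         # S = ut + (1/2)gt^2; negative means below the start

# e.g. vertical_drop(45.15, 18.44) gives about -0.82 m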

To calculate the vertical distance the ball falls by the time it reaches home plate, you can use the equations of motion.

First, you need to determine the time it takes for the ball to reach home plate. To do this, you can use the equation:

distance = velocity × time

Rearranging the equation, you get:

time = distance / velocity

Given that the distance is 60.5 ft (approximately 18.44 meters) and the velocity is 101.0 mi/h (approximately 45.15 m/s), you can calculate the time it takes for the ball to reach home plate:

time = 18.44 m / 45.15 m/s

Calculating this, you find that it takes approximately 0.408 seconds for the ball to reach home plate.

Next, you need to determine the vertical distance the ball falls during this time. To do this, you can use the equation of motion for free fall:

distance = (1/2) × acceleration × time^2

The acceleration due to gravity is approximately 9.8 m/s^2. Plugging in the value for time, you can calculate the vertical distance fallen:

distance = (1/2) × 9.8 m/s^2 × (0.408 s)^2

Evaluating this expression, you find that the ball falls approximately 0.82 meters (about 2.7 ft) vertically by the time it reaches home plate.
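
As a sanity check, the same SI-unit calculation can be scripted in a few lines of Python (just a sketch; the conversion factors 0.3048 m per ft and 0.44704 m/s per mi/h are standard constants, not given in the problem):

speed = 101.0 * 0.44704      # 101.0 mi/h in m/s, about 45.15 m/s
distance = 60.5 * 0.3048     # 60.5 ft in m, about 18.44 m
g = 9.8                      # magnitude of gravitational acceleration, m/s^2

t = distance / speed         # about 0.408 s to reach home plate
drop = 0.5 * g * t**2        # about 0.82 m of vertical fall
print(round(t, 3), round(drop, 2))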

To solve this problem, we need to calculate the vertical distance the ball will fall when thrown horizontally.

First, consider that the horizontal distance the ball travels in this case is given as 60.5 ft. The horizontal velocity (Vx) of the ball remains constant throughout its travel.

Let's now focus on the vertical motion of the ball. We can use the equation of motion:

d = (1/2) * g * t^2

where d is the vertical distance, g is the acceleration due to gravity (32.2 ft/s^2), and t is the time of flight.

Since the ball is thrown horizontally, there is no initial vertical velocity, so the drop is entirely due to gravity. The time of flight is set by the horizontal motion and can be calculated from the horizontal distance (60.5 ft) and the horizontal velocity (101.0 mi/h).

Let's convert the velocity from miles per hour to feet per second, since all other quantities are in feet:

Vx = (101.0 mi/h) * (5280 ft/mi) / (3600 s/h) = 148.1 ft/s

Now, we can calculate the time:

t = (horizontal distance) / Vx

t = 60.5 ft / 148.1 ft/s

t = 0.408 s (rounded to three decimal places)

Next, substitute the calculated time into the equation of motion to find the vertical distance:

d = (1/2) * g * t^2

d = (1/2) * (32.2 ft/s^2) * (0.408 s)^2

d = 2.68 ft (rounded to two decimal places)

Therefore, the ball would fall approximately 2.7 ft (about 0.82 m) vertically by the time it reaches home plate when thrown horizontally at 101.0 mi/h.
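
And here is the same quick check in U.S. customary units, again only as a sketch (5280 ft/mi and 3600 s/h are the standard conversion factors):

v_x = 101.0 * 5280 / 3600    # 101.0 mi/h in ft/s, about 148.1 ft/s
d_horizontal = 60.5          # horizontal distance to home plate, ft
g = 32.2                     # gravitational acceleration, ft/s^2

t = d_horizontal / v_x       # about 0.408 s
drop = 0.5 * g * t**2        # about 2.7 ft of vertical fall
print(round(t, 3), round(drop, 2))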