A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 0.955 m (3.13 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?


To determine the speed of the pitch, we treat this as projectile motion: gravity acts only vertically, so the ball's horizontal speed stays constant for the entire flight, and we can use the kinematic equation for horizontal motion.

The equation is:

d = v * t

where:
d is the horizontal distance traveled (18.3 m)
v is the constant horizontal speed of the ball
t is the time it takes for the ball to travel the distance

We are given the distance but not the time. However, we also know the ball falls 0.955 m during the flight, which lets us find the time from the vertical motion.

Because the ball is thrown horizontally, its initial vertical velocity is zero, so the time of flight follows from the free-fall equation:

h = (1/2) * g * t^2

where:
h is the vertical distance fallen (0.955 m)
g is the acceleration due to gravity (9.8 m/s^2)
t is the time of flight

Rearranging this equation to solve for t gives us:

t = sqrt(2h / g)

Plugging in the given values, we get:

t = sqrt(2 * 0.955 / 9.8) = sqrt(0.1949) ≈ 0.441 seconds

Now that we have the time, we can substitute it into the horizontal motion equation to find the speed:

v = d / t = 18.3 / 0.441 ≈ 41.5 m/s

Therefore, the speed of the pitch was approximately 41.5 m/s, which is about 93 mph, a realistic fastball speed.
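
As a quick check, here is a short Python sketch of the same two-step calculation (the variable names are mine, not from the problem):

import math

drop = 0.955   # vertical drop during the flight, m
dist = 18.3    # mound-to-plate distance, m
g = 9.8        # acceleration due to gravity, m/s^2

# Step 1: time of flight from the vertical drop, t = sqrt(2h / g)
t = math.sqrt(2 * drop / g)

# Step 2: constant horizontal speed, v = d / t
v = dist / t

print(f"t = {t:.3f} s")              # t = 0.441 s
print(f"v = {v:.1f} m/s")            # v = 41.5 m/s
print(f"v = {v * 2.23694:.0f} mph")  # v = 93 mph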