A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 0.995 m (3.26 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?

To determine the speed of the pitch, we use the equation for horizontal motion. Because gravity acts only vertically (and we neglect air resistance), the horizontal speed is constant during flight:

distance (d) = velocity (v) × time (t)

In this case, the ball travels 18.3 m (60 ft) horizontally, and its flight time equals the time it takes an object to fall 0.995 m (3.26 ft) from rest, since a horizontally thrown ball starts with no vertical velocity. We can use the equation for vertical motion under gravity to find that time:

vertical distance (y) = (1/2) × acceleration due to gravity (g) × time squared (t^2)

Given that the vertical distance is 0.995 m (3.26 ft), and the acceleration due to gravity is approximately 9.8 m/s^2, we can solve for t:

0.995 m = (1/2) × 9.8 m/s^2 × t^2
1.99 m = 9.8 m/s^2 × t^2
t^2 = 1.99 m / (9.8 m/s^2) ≈ 0.203 s^2
t ≈ √(0.203 s^2) ≈ 0.451 s

Now that we have the time taken, we can use the horizontal distance and time to find the velocity:

18.3 m = v × 0.451 s
v = 18.3 m / 0.451 s
v ≈ 40.6 m/s

Thus, the speed of the pitch is approximately 40.6 m/s (about 91 mph).
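
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation. The variable names are my own, and it assumes the same values used above: g = 9.8 m/s^2 and no air resistance.

```python
import math

g = 9.8          # acceleration due to gravity, m/s^2 (assumed, as in the text)
drop = 0.995     # vertical drop from release to home plate, m
distance = 18.3  # horizontal distance from mound to plate, m

# Fall time from y = (1/2) * g * t^2, solved for t
t = math.sqrt(2 * drop / g)

# Horizontal speed from d = v * t, solved for v
v = distance / t

print(f"fall time   t ≈ {t:.3f} s")    # ≈ 0.451 s
print(f"pitch speed v ≈ {v:.1f} m/s")  # ≈ 40.6 m/s
```

Carrying the unrounded fall time through to the last step avoids the small error introduced by rounding t to 0.45 s before dividing.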