A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 0.995 m (3.26 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?

To find the speed of the pitch, we can use the principles of projectile motion. The horizontal distance traveled (18.3 m) and the vertical drop (0.995 m) together determine the ball's initial horizontal velocity.

First, let's assume that air resistance is negligible. This allows us to separate the horizontal and vertical motions of the ball.

For horizontal motion, the ball travels a distance of 18.3 m. We can use the equation:

Distance = Velocity × Time

Since the ball is thrown horizontally, its initial vertical velocity is zero, so the vertical motion is free fall from rest. Because the horizontal and vertical motions take place over the same interval, the time taken to cover 18.3 m horizontally is the same as the time taken for the ball to fall 0.995 m vertically.

Now, let's calculate the time taken:

Vertical displacement = (1/2) × Acceleration × Time²

0.995 m = (1/2) × (9.8 m/s²) × Time²

0.995 m = 4.9 m/s² × Time²

Simplifying, we find:

Time² = (0.995 m) / (4.9 m/s²)
Time² = 0.203 sec²

Taking the square root of both sides:

Time = √(0.203 sec²)
Time ≈ 0.451 sec
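
As a quick numerical check, the fall time can be computed directly. Here is a minimal Python sketch of that step; the variable names are just for illustration:

```python
import math

g = 9.8       # gravitational acceleration, m/s^2
drop = 0.995  # vertical fall from release to home plate, m

# Free fall from rest: drop = (1/2) * g * t^2  =>  t = sqrt(2 * drop / g)
t = math.sqrt(2 * drop / g)
print(f"fall time ≈ {t:.3f} s")  # ≈ 0.451 s
```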

Since the ball traveled a distance of 18.3 m horizontally, the velocity can be calculated using the equation:

Velocity = Distance / Time
Velocity = (18.3 m) / (0.451 sec)
Velocity ≈ 40.6 m/s
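
Continuing the sketch above, the speed follows by dividing the horizontal distance by that fall time:

```python
distance = 18.3   # horizontal distance to home plate, m
v = distance / t  # t is the fall time computed in the previous snippet
print(f"pitch speed ≈ {v:.1f} m/s")  # ≈ 40.6 m/s
```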

Therefore, the pitcher's pitch was approximately 40.6 meters per second (about 91 mph).