A baseball pitcher throws a ball horizontally under negligible air resistance. The ball falls 83 cm while traveling 18.4 m to home plate. Determine the ball's initial horizontal speed.

How much time does it take for anything to fall 0.83 meters?

That is the same time the ball took to travel the horizontal distance to home plate.

To find the initial horizontal speed of the ball, we can use the equation for horizontal distance traveled:

d = v_initial * t

where:
- d is the horizontal distance traveled (18.4 m)
- v_initial is the initial horizontal velocity
- t is the time the ball takes to reach the home plate

We are given that the ball falls 83 cm while traveling 18.4 m. The vertical distance traveled is not affected by the horizontal velocity, so we can use this information to find the time it took for the ball to reach the home plate.

First, we need to convert the vertical distance from centimeters to meters:
vertical_distance = 83 cm = 0.83 m

Next, we can use the equation for the vertical distance traveled by an object under free fall:

vertical_distance = (1/2) * g * t^2

where:
- g is the acceleration due to gravity (approximately 9.8 m/s^2)
- t is the time

Rearranging the equation, we can solve for the time (t):

t^2 = (2 * vertical_distance) / g

t = sqrt((2 * 0.83) / 9.8)
t ≈ sqrt(0.1694) ≈ 0.4116 s
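As a quick numeric check of this step (using g = 9.8 m/s² as above):

```python
import math

g = 9.8      # acceleration due to gravity, m/s^2
drop = 0.83  # vertical fall, m

# Time to fall 0.83 m from rest: t = sqrt(2 * d / g)
t = math.sqrt(2 * drop / g)
print(f"t ≈ {t:.4f} s")  # t ≈ 0.4116 s
```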

Now that we have the time, we can substitute it back into the equation for horizontal distance traveled to find the initial horizontal velocity:

18.4 m = v_initial * 0.4116 s

Solving for v_initial:

v_initial = 18.4 m / 0.4116 s
v_initial ≈ 44.7 m/s

Therefore, the ball's initial horizontal speed is approximately 44.7 m/s.
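The whole calculation can be verified in a few lines (again taking g = 9.8 m/s²):

```python
import math

g = 9.8          # acceleration due to gravity, m/s^2
drop = 0.83      # vertical fall, m
distance = 18.4  # horizontal distance to home plate, m

t = math.sqrt(2 * drop / g)  # fall time = flight time
v_initial = distance / t     # horizontal speed
print(f"v_initial ≈ {v_initial:.1f} m/s")  # v_initial ≈ 44.7 m/s
```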