A pitcher throws a fastball horizontally at 156 km/h toward home plate, which is 18.2 m away. Neglecting air resistance (not a good idea if you are the batter), find how far the ball drops because of gravity by the time it reaches home plate.

I did the arrow one, now you do some.

To find how far the ball drops due to gravity by the time it reaches the home plate, we need to calculate the time it takes for the ball to travel the 18.2 m distance.

Step 1: Convert speed from km/h to m/s.
156 km/h = (156 * 1000 m) / (3600 s) ≈ 43.33 m/s

Step 2: Use horizontal velocity to find the time of flight.
Since the ball is thrown horizontally, the horizontal velocity remains constant throughout the motion. The formula to calculate time (t) is: distance (d) = velocity (v) * time (t).
18.2 m = 43.33 m/s * t

Solving for t:
t = 18.2 m / 43.33 m/s ≈ 0.420 s

Step 3: Use the time of flight to find the vertical displacement due to gravity.
We can use the formula for vertical displacement (d) in free fall due to gravity: d = 0.5 * g * t^2, where g is the acceleration due to gravity (approximately 9.8 m/s^2).

d = 0.5 * 9.8 m/s^2 * (0.420 s)^2
d ≈ 0.864 m

Therefore, the ball drops approximately 0.864 m (about 86 cm) due to gravity by the time it reaches home plate.
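The three steps above can be checked with a short Python sketch (the function name and g = 9.8 m/s^2 are my choices; air resistance is neglected, as in the problem):

```python
def drop_distance(speed_kmh: float, distance_m: float, g: float = 9.8) -> float:
    """Vertical drop (m) of a ball thrown horizontally over a given distance."""
    v = speed_kmh * 1000 / 3600   # Step 1: convert km/h to m/s
    t = distance_m / v            # Step 2: time of flight from horizontal motion
    return 0.5 * g * t ** 2       # Step 3: free-fall drop in that time

drop = drop_distance(156, 18.2)
print(f"{drop:.3f} m")  # 0.864 m
```

Because the horizontal and vertical motions are independent, the horizontal speed only sets the flight time; gravity then determines the drop from that time alone.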