A ball rolls off a 1.40 m high table with an initial horizontal speed of 0.600 m/s. How far from the edge of the table will it land?

To determine how far the ball will land, we can use the kinematic equations of motion.

In this scenario, the initial vertical velocity is zero as the ball rolls off the table horizontally. The acceleration due to gravity, denoted as "g," is approximately 9.8 m/s².

We want to find the horizontal distance the ball will cover before hitting the ground. Let's denote this distance as "d."

Using the kinematic equation for the horizontal direction, we have:

d = (initial horizontal velocity) * (time of flight)

To find the time of flight (the time it takes for the ball to hit the ground), we can use the kinematic equation for the vertical direction:

vertical displacement = (initial vertical velocity) * (time) + (0.5) * (acceleration) * (time)²

The vertical displacement is -1.40 m (the ball falls a height of 1.40 m, and we take upward as positive), the initial vertical velocity is 0 m/s, and the acceleration is -9.8 m/s² (negative because it is directed downward). We can rearrange the equation to solve for time:

-1.40 m = 0.5 * (-9.8 m/s²) * (time)²

Simplifying further, we have:

(time)² = (-2 * 1.40 m) / (-9.8 m/s²)

(time)² = 0.2857 s²

Taking the square root of both sides, we find:

time = √(0.2857 s²) ≈ 0.5345 s
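
As a quick numeric check, here is a minimal Python sketch of that step (the variable names h, g, and t are just illustrative):

    import math

    h = 1.40  # table height in metres
    g = 9.8   # magnitude of gravitational acceleration, m/s^2

    # Rearranged vertical kinematic equation: t = sqrt(2h / g)
    t = math.sqrt(2 * h / g)
    print(f"fall time t = {t:.4f} s")  # prints ~0.5345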

Now that we have the time of flight, we can calculate the horizontal distance:

d = (0.600 m/s) * (0.5345 s) ≈ 0.321 m

Therefore, the ball will land approximately 0.321 meters (about 32 cm) away from the edge of the table.
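
Putting both steps together, here is a small self-contained Python sketch; the helper name projectile_range is hypothetical, chosen only for this illustration:

    import math

    def projectile_range(height, v0, g=9.8):
        """Horizontal landing distance for a ball leaving a table of the
        given height with horizontal speed v0 (air resistance ignored)."""
        t = math.sqrt(2 * height / g)  # fall time from height = 0.5 * g * t^2
        return v0 * t                  # horizontal distance d = v0 * t

    print(f"{projectile_range(1.40, 0.600):.3f} m")  # prints ~0.321 m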

h = 0.5 * g * t^2

h = 1.40 m
g = 9.8 m/s^2 (downward taken as positive here)
Solve for t (the fall time): t = √(2h/g) ≈ 0.534 s

v0 = 0.600 m/s (initial horizontal speed)

Dx = v0 * t ≈ (0.600 m/s)(0.534 s) ≈ 0.321 m