A ball is thrown horizontally at 20 m/s from the top of a cliff 50 meters high. How far from the base of the cliff does the ball strike the ground?

To find how far from the base of the cliff the ball lands, we can use the kinematic equations for constant acceleration together with the fact that the initial vertical velocity is zero.

Because the ball is thrown horizontally, its initial vertical velocity is zero. Neglecting air resistance, there is no horizontal force on the ball, so its horizontal velocity stays constant at 20 m/s while gravity accelerates it downward.

Using the equation of motion:

distance = initial velocity × time + (1/2) × acceleration × time^2

In the vertical direction the initial velocity is zero, so the first term drops out and the fall is described by:

height = (1/2) × acceleration × time^2

In this case, the height is 50 meters and the acceleration due to gravity is 9.8 m/s^2. We can rearrange the equation to solve for time:

time = sqrt(2 × height / acceleration)

Plugging in the values:

time = sqrt(2 × 50 / 9.8) ≈ 3.194 seconds
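
As a quick numerical check, here is a minimal Python sketch of the same time-of-flight calculation (the variable names are illustrative, not part of the original problem):

import math

height = 50.0   # cliff height in meters
g = 9.8         # acceleration due to gravity in m/s^2

# Solve height = (1/2) * g * t^2 for t
t = math.sqrt(2 * height / g)
print(t)        # prints roughly 3.194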

Now that we have the total time of flight, we can find the horizontal distance traveled by the ball using the equation:

horizontal distance = initial velocity × time

Plugging in the values:

horizontal distance = 20 m/s × 3.194 s ≈ 63.9 meters

Therefore, the ball strikes the ground approximately 63.9 meters (about 64 m) from the base of the cliff.
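
Putting both steps together, a short Python sketch (assumed variable names, air resistance ignored) reproduces the full result:

import math

v0 = 20.0      # horizontal launch speed in m/s
height = 50.0  # cliff height in meters
g = 9.8        # acceleration due to gravity in m/s^2

t = math.sqrt(2 * height / g)   # time to fall 50 m starting from zero vertical velocity
x = v0 * t                      # horizontal distance covered at constant 20 m/s
print(x)                        # prints roughly 63.9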
