A ball is thrown horizontally with a speed of 20 m/s from the top of a 6.2 m tall hill. How far from the point on the ground directly below the launch point does the ball strike the ground?

To find the horizontal distance the ball travels, we can use the formula for horizontal motion:

d = v * t

where d is the horizontal distance, v is the horizontal velocity, and t is the time of flight.

Since the ball is thrown horizontally, the initial horizontal velocity equals the throwing speed, 20 m/s.

There is no horizontal acceleration, so the horizontal velocity stays constant at 20 m/s. The time of flight is set entirely by the vertical motion: it is the time the ball takes to fall from the top of the hill to the ground.

To find the time of flight, we can use the formula for vertical motion:

y = v₀t + 0.5at²

where y is the vertical displacement, v₀ is the initial vertical velocity, t is the time of flight, and a is the acceleration.

The vertical displacement is equal to the height of the hill, 6.2 m. The initial vertical velocity is zero since the ball is thrown horizontally. Taking downward as the positive direction, the acceleration due to gravity is a ≈ 9.8 m/s².

Plugging these values into the vertical motion formula, we get:

6.2 = 0 + 0.5 * 9.8 * t²

Simplifying the equation, we get:

4.9 * t² = 6.2

Dividing both sides by 4.9, we find:

t² ≈ 1.265

Taking the square root of both sides, we get:

t ≈ √1.265 ≈ 1.125 s

Now we can substitute the value of t into the formula for horizontal motion:

d = v * t

d = 20 m/s * 1.125 s

d ≈ 22.5 m

Therefore, the ball strikes the ground approximately 22.5 meters from the point on the ground directly below the launch point.
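
If you want to double-check the arithmetic, here is a minimal Python sketch of the same two-step calculation (the variable names are just illustrative):

import math

v = 20.0   # horizontal launch speed (m/s)
h = 6.2    # height of the hill (m)
g = 9.8    # acceleration due to gravity (m/s^2), downward taken as positive

t = math.sqrt(2 * h / g)   # time to fall h metres from rest: h = 0.5 * g * t^2
d = v * t                  # horizontal distance covered in that time

print(f"time of flight: {t:.3f} s")        # about 1.125 s
print(f"horizontal distance: {d:.1f} m")   # about 22.5 m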