A ball is thrown horizontally from the top of a 20 km high cliff. If the initial horizontal velocity of the ball is 10 m/s, how far from the base of the cliff will the ball land?

To solve this problem, we can use the equation for horizontal distance traveled by an object in projectile motion:

d = v * t

where d is the horizontal distance, v is the horizontal velocity, and t is the time of flight.

Since the ball is thrown horizontally, its initial vertical velocity is 0 m/s. The time of flight can be determined using the equation for vertical displacement:

h = (1/2) * g * t^2

where h is the vertical drop and g is the acceleration due to gravity (approximately 9.8 m/s^2).

In this case, the vertical drop is the height of the cliff (20 km = 20,000 m). Substituting this into the equation gives:

20,000 = (1/2) * 9.8 * t^2

Multiplying both sides by 2, we get:

40,000 = 9.8 * t^2

Dividing both sides by 9.8, we have:

t^2 = 40,000 / 9.8
t^2 ≈ 4081.63

Taking the square root of both sides, we find:

t ≈ √4081.63
t ≈ 63.89 seconds
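
As a quick numerical check, this step fits in a minimal Python sketch (the variable names g, h, and t are just illustrative), solving h = (1/2) * g * t^2 for t:

```python
import math

g = 9.8       # acceleration due to gravity, m/s^2
h = 20_000    # vertical drop: height of the cliff, m

# Rearranging h = (1/2) * g * t^2 gives t = sqrt(2 * h / g)
t = math.sqrt(2 * h / g)
print(f"time of flight t ≈ {t:.2f} s")   # prints ≈ 63.89 s
```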

Now that we have the time of flight, we can calculate the horizontal distance using the equation d = v * t:

d = 10 * 63.89
d ≈ 638.9 meters

Therefore, the ball will land approximately 638.9 meters from the base of the cliff.
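
Putting both steps together, the whole calculation can be written as a short Python sketch (again, the names g, h, v, t, and d are just illustrative, and air resistance is ignored as in the worked solution above):

```python
import math

g = 9.8       # acceleration due to gravity, m/s^2
h = 20_000    # cliff height, m (20 km)
v = 10        # initial horizontal velocity, m/s

t = math.sqrt(2 * h / g)   # time of flight from h = (1/2) * g * t^2
d = v * t                  # horizontal distance d = v * t

print(f"landing distance d ≈ {d:.1f} m")   # prints ≈ 638.9 m
```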