A stone is thrown horizontally from the edge of a cliff with a speed of 18 m/s. The cliff is 50 meters high. How far does the stone travel horizontally before it hits the ground?

To find out how far the stone will travel horizontally before hitting the ground, we can use the equations of motion.

First, let's analyze the stone's vertical motion. Since the stone is thrown horizontally, its initial vertical velocity is zero. The only force acting on it vertically is gravity, which accelerates it downward at 9.8 m/s^2.

We can use the equation:
h = ut + (1/2)gt^2

where:
h is the height of the cliff
u is the initial vertical velocity (which is zero)
g is the acceleration due to gravity (9.8 m/s^2)
t is the time it takes for the stone to hit the ground

Plugging in the values we know:
50 = 0 + (1/2)(9.8)t^2

Simplifying the equation gives us:
50 = 4.9t^2

Now, we can solve for the time, t:
t^2 = 50/4.9 ≈ 10.2
t = √(50/4.9)
t ≈ 3.19 seconds
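
As a quick numerical check, here is a minimal Python sketch (the variable names g, h, and t are my own) that solves h = (1/2)gt^2 for the fall time:

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 50.0  # height of the cliff, m

# With zero initial vertical velocity, h = (1/2) * g * t^2,
# so the fall time is t = sqrt(2 * h / g).
t = math.sqrt(2 * h / g)
print(f"Time to hit the ground: {t:.2f} s")  # ≈ 3.19 s
```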

Now that we know it takes approximately 3.19 seconds for the stone to hit the ground, we can find the horizontal distance traveled.

Because gravity acts only vertically, the horizontal velocity stays constant (ignoring air resistance), so the horizontal distance is given by the equation:
d = ut

where:
d is the horizontal distance
u is the initial horizontal velocity

In this case, the stone is thrown horizontally, so the initial horizontal velocity, u, is equal to the speed at which it was thrown, which is 18 m/s.

Plugging in the values we know:
d = 18 × 3.19
d ≈ 57.5 meters

Therefore, the stone travels approximately 57.5 meters horizontally before hitting the ground.
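
And the corresponding check for the horizontal distance, reusing the fall time computed above (again a sketch with assumed variable names):

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2
h = 50.0   # height of the cliff, m
u = 18.0   # initial horizontal speed, m/s

t = math.sqrt(2 * h / g)  # fall time from the vertical motion
d = u * t                 # horizontal velocity is constant (no air resistance)
print(f"Horizontal distance: {d:.1f} m")  # ≈ 57.5 m
```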