A student stands at the edge of a cliff and throws a stone horizontally over the edge with a speed of 18 m/s. The cliff is 50 m above a flat, horizontal beach. How long after being released does the stone strike the beach below the cliff?

distance = 1/2*g*t^2

Solve for t. Check my thinking.
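
Solving that hint for t symbolically (the same rearrangement both answers below rely on):

t = √(2 × distance / g)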

We can treat this projectile motion as two simultaneous, perpendicular motions: uniform velocity horizontally and uniform acceleration vertically.

Assuming that the height of the student can be neglected, that gravity is constant at g = 10 m/s^2, that there is no air resistance or other disturbance, and that the Earth is an inertial frame:

The time of flight can be found from S = ut + (1/2)*at^2 applied vertically; since the initial vertical velocity is 0, the 'ut' term can be dropped.

50 = (1/2)*10*t^2
t^2 = 10
t = √10 s ≈ 3.16 s

The horizontal distance can be found from S = ut (the stone moves with uniform horizontal velocity, since there is no horizontal acceleration).

S = 18*√10
S = 18√10 m ≈ 56.9 m

To find the time it takes for the stone to strike the beach, we can use the kinematic equation for vertical free fall (the horizontal speed does not affect the fall time, because the horizontal and vertical motions are independent):

distance = (1/2) × g × t^2

In this case, the distance fallen vertically by the stone is the height of the cliff, which is 50 m. The horizontal speed of 18 m/s only determines how far from the base of the cliff the stone lands, not when it lands.

So, we need to find the time it takes for a stone, starting with zero vertical velocity, to fall 50 m under gravity.

Using the formula, we can rearrange it to solve for time:

time = √(2 × distance / g)

Substituting the values, with g = 9.8 m/s^2, we have:

time = √(2 × 50 m / 9.8 m/s^2)

Simplifying the equation, we get:

time ≈ 3.19 s

Therefore, it takes approximately 3.2 seconds for the stone to strike the beach below the cliff after being released (about 3.16 s if g is rounded to 10 m/s^2, consistent with the answer above).
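
A quick numerical check of this version, same sketch as above but with g = 9.8 m/s^2:

import math

g = 9.8   # gravitational acceleration in m/s^2
h = 50.0  # vertical drop to the beach in m

# Fall time from h = (1/2) * g * t^2
t = math.sqrt(2 * h / g)

print(f"time of flight: {t:.2f} s")  # 3.19 s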