An object is thrown from a height of 8.0 m with a horizontal velocity of 28.0 m/s. How far will this object travel horizontally before hitting the ground?

t = sqrt(2h/g)

s = vt

I really don't understand these formulas.

Please explain them to me.

http://www.physicstutorials.org/home/mechanics/1d-kinematics/projectile-motion?start=1

To determine the horizontal distance the object will travel before hitting the ground, we need to first find the time it takes for the object to reach the ground. We can use the equation:

h = (1/2)gt^2

where:
- h is the initial height of the object (8.0 m),
- g is the magnitude of the acceleration due to gravity (9.8 m/s^2, taking the downward direction as positive),
- t is the time it takes for the object to reach the ground (unknown).

Rearranging the equation, we get:

t^2 = (2h/g)

Now we can substitute the values into the equation:

t^2 = (2 * 8.0 m) / (9.8 m/s^2)
t^2 ≈ 1.63 s^2

Taking the square root of both sides:

t ≈ 1.28 s

So, it takes approximately 1.28 seconds for the object to hit the ground.
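
If it helps to check the arithmetic, here is a minimal Python sketch of this first step (the variable names h, g, and t are just illustrative, not part of the problem):

import math

h = 8.0   # initial height in metres (from the problem)
g = 9.8   # magnitude of the acceleration due to gravity in m/s^2

t = math.sqrt(2 * h / g)   # rearranged from h = (1/2) g t^2
print(f"t = {t:.2f} s")    # prints: t = 1.28 s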

Now, we can use this time to calculate the horizontal distance traveled by the object. We can use the equation:

d = v * t

where:
- d is the horizontal distance traveled (unknown),
- v is the horizontal velocity of the object (28.0 m/s),
- t is the time it takes for the object to reach the ground (1.28 s).

Substituting the values into the equation:

d = 28.0 m/s * 1.28 s
d ≈ 35.8 m

Therefore, the object will travel approximately 35.8 meters horizontally before hitting the ground.
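
Putting both steps together, a short Python sketch of the whole calculation (again with illustrative variable names) would look like this:

import math

h = 8.0    # initial height in metres
g = 9.8    # magnitude of the acceleration due to gravity in m/s^2
v = 28.0   # horizontal velocity in m/s

t = math.sqrt(2 * h / g)   # time to fall 8.0 m
d = v * t                  # horizontal distance covered in that time
print(f"t = {t:.2f} s, d = {d:.1f} m")   # prints: t = 1.28 s, d = 35.8 m

Carrying the unrounded time through the second step is what gives 35.8 m; multiplying by the rounded value 1.28 s instead gives 35.84 m, which is the same answer to three significant figures.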