A man stands on the roof of a building and throws a stone upwards at 15 m/s. After what time will the stone hit the ground, 20 m below?

You want to find the time at which the height h(t) = 0, starting from 20 m up:

h(t) = 20 + 15t - 4.9t^2

Thanks a lot

To find the time it takes for the stone to hit the ground, we need to use the equation of motion:

s = ut + (1/2)at^2

Where (taking upwards as positive):
s = displacement (-20 m here, since the ground is 20 m below the starting point)
u = initial velocity (15 m/s)
a = acceleration due to gravity (-9.8 m/s²)

With upwards taken as positive, both the acceleration due to gravity and the final displacement are negative. Plugging the values into the equation, we get:

-20 = 15t + (1/2)(-9.8)t^2

Rearranging the equation, we have:

-4.9t^2 + 15t + 20 = 0

To solve for 't', we can use the quadratic formula:

t = (-b ± √(b^2 - 4ac)) / (2a)

Plugging in the values, we have:

t = (-15 ± √(15^2 - 4(-4.9)(20))) / (2(-4.9))

Calculating further, we get:

t = (-15 ± √(225 + 392)) / (-9.8)
t = (-15 ± √617) / (-9.8)
t = (-15 ± 24.84) / (-9.8)

The discriminant (b^2 - 4ac) is positive, so there are two real roots: t ≈ 4.07 s and t ≈ -1.00 s. The negative root lies before the stone was thrown and has no physical meaning here, so the stone hits the ground about 4.07 s after being thrown. (This agrees with solving h(t) = 20 + 15t - 4.9t^2 = 0 from the start of the thread, which is the same quadratic multiplied through by -1.)
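As a quick sanity check, the same quadratic can be solved numerically. This is a minimal Python sketch; the coefficient values follow the working above, and the variable names (a, b, c, roots) are just illustrative:

```python
import math

# Equation of motion with upwards positive: -4.9 t^2 + 15 t + 20 = 0
a = -4.9   # (1/2) * g, with g = -9.8 m/s^2
b = 15.0   # initial velocity in m/s
c = 20.0   # starting height above the ground in m

disc = b**2 - 4 * a * c  # discriminant: 225 + 392 = 617
roots = [(-b + sign * math.sqrt(disc)) / (2 * a) for sign in (1, -1)]
t = max(roots)           # keep the physically meaningful (positive) root

print(round(t, 2))  # prints 4.07
```

The smaller root comes out at about -1.00 s, matching the hand calculation, and is discarded because it predates the throw.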