An object is thrown upwards at an initial speed of 19.6 m/s from a height of 65 m. Determine how long it will take for the object to reach the ground (time of flight). What formula should I use?

http://www.jiskha.com/display.cgi?id=1345615393

To determine the time it takes for the object to reach the ground (the time of flight), we can use the kinematic equation for height above the ground under constant acceleration:

y = h + ut + (1/2)at^2

Where:
y = height above the ground at time t
h = initial height (65 m)
u = initial velocity (19.6 m/s, taking upward as positive)
a = acceleration due to gravity (-9.8 m/s^2, since it acts downward)
t = time

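If it helps to see the formula in code, here is a minimal Python sketch of the height equation (the function name and default values are just illustrative choices, not part of the original problem):

def height(t, h=65.0, u=19.6, a=-9.8):
    # y(t) = h + u*t + (1/2)*a*t^2, height above the ground in metres
    return h + u * t + 0.5 * a * t * t

print(height(0.0))  # 65.0 m at launch
print(height(2.0))  # 84.6 m at the peak (reached at t = u/|a| = 2 s)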
The object reaches the ground when its height is zero, i.e. y = 0. Plugging the known values into the equation, we get:

0 = 65 + (19.6)t + (0.5)(-9.8)(t^2)

Simplifying the equation gives:

0 = 65 + 19.6t - 4.9t^2

Rearranging into standard quadratic form, we have:

4.9t^2 - 19.6t - 65 = 0

This quadratic does not factor nicely, so we solve it with the quadratic formula:

t = [19.6 ± sqrt((19.6)^2 + 4(4.9)(65))] / (2(4.9))
t = [19.6 ± sqrt(384.16 + 1274)] / 9.8
t = (19.6 ± 40.72) / 9.8

This gives two roots: t ≈ -2.2 s and t ≈ 6.2 s. The negative root corresponds to a time before the throw and has no physical meaning, so we take the positive root:

t = (19.6 + 40.72) / 9.8
t ≈ 6.2 seconds

Therefore, it will take about 6.2 seconds for the object to reach the ground (time of flight).
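As a quick numerical check, the same quadratic can be solved with a short Python sketch (again, the function name time_of_flight is just an illustrative choice):

import math

def time_of_flight(u=19.6, h=65.0, g=9.8):
    # Solve 0 = h + u*t - (1/2)*g*t^2 for the positive root.
    a, b, c = -0.5 * g, u, h              # coefficients of a*t^2 + b*t + c = 0
    disc = b * b - 4 * a * c              # discriminant
    roots = [(-b + s * math.sqrt(disc)) / (2 * a) for s in (1, -1)]
    return max(roots)                     # the positive (physical) root

print(time_of_flight())  # prints roughly 6.16, matching the answer above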