science
posted by Mykmoloko on .
A projectile is fired with an initial speed of 120 m/s at an angle of 60° above the horizontal from the top of a 50 m high cliff. (a) Determine the maximum height (above the cliff) reached by the projectile. (b) How long does it take to fall to the ground from the maximum height? Please give me some help, I'm confused.

use the formula vfy = viy - gt...
vfy = 0; this is always true when the object reaches its maximum height.
viy = vi sin(angle) (the vertical component)
viy = 120 sin 60°
viy = 103.92 m/s
g = 9.81 m/s^2
thus:
vfy = viy - gt
0 = 103.92 - (9.81)(t)
t = 103.92/9.81
t ≈ 10.59 seconds to reach the maximum height above the cliff.
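A quick Python check of this step (taking g = 9.81 m/s^2 as in the working above):

```python
import math

# Values from the problem; g assumed to be 9.81 m/s^2.
vi = 120.0                  # initial speed, m/s
theta = math.radians(60.0)  # launch angle
g = 9.81                    # gravitational acceleration, m/s^2

viy = vi * math.sin(theta)  # vertical component of initial velocity
t_up = viy / g              # time to reach max height (where vfy = 0)

print(round(viy, 2))   # 103.92 m/s
print(round(t_up, 2))  # 10.59 s
```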
getting the height...
compute dy, the height gained above the top of the cliff:
dy = viy·t - 1/2·g·t^2 (the t in the second term is squared)
That dy is the answer to part (a). Then add the dy you computed to the height of the cliff (50 m) to get the total height above the ground, which is the distance it falls in part (b).
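Plugging in the numbers from the steps above (a quick sketch, again assuming g = 9.81 m/s^2):

```python
import math

vi, g = 120.0, 9.81
viy = vi * math.sin(math.radians(60.0))  # vertical launch speed
t_up = viy / g                           # time to the peak

dy = viy * t_up - 0.5 * g * t_up**2      # height gained above the cliff
h_total = dy + 50.0                      # total height above the ground

print(round(dy, 1))       # 550.5 m above the cliff (part a)
print(round(h_total, 1))  # 600.5 m above the ground
```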
Using the new distance (50 m plus the dy you computed above the cliff), use the same dy formula again. Now viy = 0, because at the maximum height the object's vertical speed is zero; as it starts falling, that zero final velocity of the ascent becomes the initial velocity of the descent.
(I'm not sure, but I guess this is how it is done.)
then get the time of the fall from:
dy = viy·t + 1/2·g·t^2 (please square the t, can't do superscripts here). With viy = 0 this reduces to dy = 1/2·g·t^2, so t = sqrt(2·dy/g), where dy is the total distance you found above.
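Putting the whole of part (b) together as a quick Python check (a sketch assuming g = 9.81 m/s^2; the fall starts from rest at the peak):

```python
import math

vi, g = 120.0, 9.81
viy = vi * math.sin(math.radians(60.0))  # vertical launch speed
t_up = viy / g                           # time to the peak
dy = viy * t_up - 0.5 * g * t_up**2      # height of the peak above the cliff
h_total = dy + 50.0                      # total drop from peak to ground

# Falling from rest (viy = 0): h = (1/2) g t^2  =>  t = sqrt(2h/g)
t_fall = math.sqrt(2.0 * h_total / g)
print(round(t_fall, 2))  # 11.06 s (part b)
```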