posted by NENE
Two spheres are launched horizontally from a 1.1 m-high table. Sphere A is launched with an initial speed of 5.0 m/s. Sphere B is launched with an initial speed of 3.0 m/s.
A. What is the time for sphere A to hit the floor, and for sphere B?
B. What horizontal distance does sphere A travel from the edge of the table, and what distance does sphere B travel?
Since they are both launched horizontally, the initial vertical velocity component is zero in each case, so the times they take to hit the ground are the same.
That time T can be obtained by solving
(g/2) T^2 = 1.1 m
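Assuming g ≈ 9.8 m/s^2, this gives T = sqrt(2 * 1.1 / 9.8) ≈ 0.47 s.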
The horizontal distance traveled when they hit the ground is Vo*T, with
Vo = 3.0 or 5.0 m/s.
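In case it helps, here is a minimal Python sketch of the calculation above, assuming g ≈ 9.8 m/s^2 (the variable names are illustrative):

```python
import math

g = 9.8   # gravitational acceleration in m/s^2 (assumed value)
h = 1.1   # table height in m

# Vertical motion: both spheres start with zero vertical velocity,
# so (g/2) * T**2 = h gives the same fall time T for each sphere.
T = math.sqrt(2 * h / g)

# Horizontal motion: distance from the table edge is Vo * T.
for name, v0 in (("A", 5.0), ("B", 3.0)):
    print(f"Sphere {name}: T = {T:.2f} s, distance = {v0 * T:.2f} m")
```

This prints T ≈ 0.47 s for both spheres, with distances of about 2.4 m for sphere A and 1.4 m for sphere B.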
The second question is very tricky. Can you please give me the equation to use to find the distance?
I got the answer. Thanks.