Two spheres are launched horizontally from a 1.1 m high table. Sphere A is launched with an initial speed of 5.0 m/s. Sphere B is launched with an initial speed of 3.0 m/s.

A. How long does sphere A take to hit the floor? How long does sphere B take?
B. What horizontal distance does sphere A travel from the edge of the table? What horizontal distance does sphere B travel?

The second question is very tricky. Can you please give me the equation to use to find the distance?

I got the answer, thanks.

To solve this problem, we can use the equations of motion for an object in free fall.

For part A:
1. Find the time (t) for sphere A to hit the floor:
To do this, we can use the equation: h = (1/2) * g * t^2, where h is the height of the table and g is the acceleration due to gravity (approximately 9.8 m/s^2).
So, for sphere A, h = 1.1 m.
Plug these values into the equation and solve for t:
1.1 = (1/2) * 9.8 * t^2
Simplify the equation:
0.5 * 9.8 * t^2 = 1.1
4.9 * t^2 = 1.1 (since 0.5 * 9.8 = 4.9)
t^2 = 1.1 / 4.9 (dividing both sides by 4.9)
t^2 ≈ 0.2245
t ≈ √0.2245 (taking the square root of both sides)
t ≈ 0.474 seconds (rounded to 3 decimal places)

So, sphere A takes approximately 0.474 seconds to hit the floor.

Now, let's find the time (t) for sphere B to hit the ground:
We can use the same equation: h = (1/2) * g * t^2. Sphere B falls from the same height, h = 1.1 m, and the fall time does not depend on the horizontal launch speed, so the calculation is identical:
1.1 = (1/2) * 9.8 * t^2
4.9 * t^2 = 1.1
t^2 = 1.1 / 4.9 ≈ 0.2245
t ≈ √0.2245 ≈ 0.474 seconds

So, sphere B also takes approximately 0.474 seconds to hit the ground.
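
If you want to verify the arithmetic, here is a minimal Python sketch of the fall-time calculation (the variable names are my own, assuming g = 9.8 m/s^2 as above):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 1.1   # table height, m

# Solve h = (1/2) * g * t^2 for t. The horizontal launch speed never
# enters the vertical equation, so the same t applies to both spheres.
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.3f} s")  # prints ~0.474
```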

For part B:
To find the distance traveled by each sphere from the edge of the table, we need to calculate the horizontal distance (range) each sphere covers during its fall.

The formula for the range (R) of a projectile launched horizontally is given by R = V * t, where V is the initial horizontal velocity and t is the time of flight.

1. For sphere A:
The initial horizontal velocity of sphere A is 5.0 m/s, and we now have the time t = 0.474 seconds.
R = 5.0 m/s * 0.474 s = 2.37 meters

So, sphere A travels approximately 2.37 meters from the edge of the table.

2. For sphere B:
The initial horizontal velocity of sphere B is 3.0 m/s, and the time t = 0.474 seconds (as calculated above).
R = 3.0 m/s * 0.474 s ≈ 1.42 meters

So, sphere B travels approximately 1.42 meters from the edge of the table.
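
And a quick numerical check of part B, again just a sketch with my own variable names:

```python
import math

g = 9.8   # m/s^2
h = 1.1   # m
t = math.sqrt(2 * h / g)  # shared fall time from part A, ~0.474 s

# Horizontal range R = V * t for each launch speed
for name, v in [("A", 5.0), ("B", 3.0)]:
    print(f"sphere {name}: R = {v * t:.2f} m")
# sphere A: R = 2.37 m
# sphere B: R = 1.42 m
```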

Since both spheres are launched horizontally, the initial vertical velocity component is zero in each case, so they take the same time to hit the ground.

That time T can be obtained by solving

(g/2) T^2 = 1.1 m

The horizontal distance traveled when they hit the ground is Vo * T, where

Vo = 3.0 m/s (sphere B) or 5.0 m/s (sphere A)
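
Putting both steps together in one short sketch (same g = 9.8 m/s^2 assumption, variable names mine):

```python
import math

g, h = 9.8, 1.1  # m/s^2, m

# (g/2) * T^2 = h  =>  T = sqrt(2 * h / g); the same T for both spheres
T = math.sqrt(2 * h / g)

for v0 in (3.0, 5.0):
    print(f"Vo = {v0} m/s -> distance = {v0 * T:.2f} m")
```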