A rock is launched off a table horizontally with a velocity of 10 m/s. If the table is 1.5 m above the floor, how long will it take the projectile to hit the floor? How far from the edge of the table will the rock land?

2(1.5)/9.8 = t^2, so t ≈ 0.553? What is the formula for the second part?

To find the time it takes the rock to hit the floor, we use the kinematic equation for vertical motion. Because the rock is launched horizontally, its initial vertical velocity is zero, so the fall from the table is described by:

h = (1/2) * g * t^2

where:
h is the height from which the object falls (1.5 m in this case),
g is the acceleration due to gravity (approximated as 9.8 m/s^2 on Earth),
t is the time it takes for the object to hit the floor (what we need to find).

Rearranging the equation to solve for t, we have:

t^2 = (2 * h) / g

Plugging in the given values, we have:

t^2 = (2 * 1.5) / 9.8

t^2 = 0.306
t = √0.306
t ≈ 0.553 seconds

So, the time it takes for the rock to hit the floor is approximately 0.553 seconds.
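If you want to double-check the arithmetic, here is a minimal Python sketch of this step (assuming g = 9.8 m/s^2 and a table height of 1.5 m; the variable names are just for illustration):

```python
import math

g = 9.8   # acceleration due to gravity (m/s^2)
h = 1.5   # height of the table above the floor (m)

# Rearranged from h = (1/2) * g * t^2
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.3f} s")   # prints approximately 0.553 s
```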

Now, to determine how far from the edge of the table the rock lands, we use the formula for horizontal distance traveled during projectile motion. Ignoring air resistance, there is no horizontal acceleration, so the horizontal velocity stays constant and the distance is simply:

d = v * t

where:
d is the distance traveled,
v is the horizontal velocity of the object (10 m/s in this case),
t is the time (0.553 seconds in this case, as we just calculated).

Plugging in the given values, we have:

d = 10 * 0.553

d ≈ 5.53 meters

Therefore, the rock will land approximately 5.53 meters from the edge of the table.
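As a quick sanity check of both steps together, a short Python sketch (again assuming g = 9.8 m/s^2, h = 1.5 m, and v = 10 m/s; names are illustrative) gives the same numbers:

```python
import math

g, h, v = 9.8, 1.5, 10.0   # gravity (m/s^2), table height (m), horizontal speed (m/s)

t = math.sqrt(2 * h / g)   # fall time from h = (1/2) * g * t^2, about 0.553 s
d = v * t                  # horizontal distance from d = v * t
print(f"d ≈ {d:.2f} m")    # prints approximately 5.53 m
```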