2) Two objects are initially the same height above the ground. Simultaneously, one is released from rest and the other is shot horizontally with an initial speed of 2.5 m/s. The two objects collide after falling 20.0 m. How far apart were the objects initially?

≈ 5.05 m


To solve this problem, we need to break it down into two parts: vertical motion and horizontal motion. Let's start with the vertical motion:

Let t be the time (in seconds) it takes the objects to fall the 20.0 m before they collide. For the vertical motion we can use the kinematic equation:

s = ut + 0.5gt^2

Where:
s = vertical displacement (20.0 m)
u = initial vertical velocity (0 m/s for both objects: one is released from rest and the other is launched purely horizontally)
g = acceleration due to gravity (9.8 m/s^2, taking downward as the positive direction)

Because both objects start with zero vertical velocity, the equation becomes:

20.0 = 0*t + 0.5*(9.8)*t^2

Simplifying this equation gives:

4.9t^2 = 20.0

Dividing both sides by 4.9, we get:

t^2 = 4.08

Taking the square root of both sides, we find:

t = 2.02 seconds (approx.)
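
As a quick numerical check, here is a minimal Python sketch of this step, using the same values as above (the variable names are just illustrative):

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2 (downward taken as positive)
s = 20.0   # vertical distance fallen before the collision, m

# From s = 0.5 * g * t^2, since the initial vertical velocity is zero
t = math.sqrt(2 * s / g)
print(f"fall time t = {t:.2f} s")  # prints about 2.02 s
```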

Now, let's move on to the horizontal motion:

The object that is shot horizontally has an initial horizontal velocity of 2.5 m/s. Since there is no horizontal acceleration, its horizontal displacement is given by:

s = ut

Where:
s = horizontal displacement
u = initial velocity in the horizontal direction
t = time taken to collide (2.02 seconds)

Plugging in the values, we get:

s = 2.5*2.02
s = 5.05 meters

Since both objects start at the same height, this horizontal distance is their entire initial separation: the objects were initially about 5.05 m apart.
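
Putting the two steps together, a short Python sketch of the whole calculation (same assumed values as above) looks like this:

```python
import math

g = 9.8      # m/s^2, downward taken as positive
drop = 20.0  # m, vertical distance fallen before the collision
v_x = 2.5    # m/s, horizontal launch speed of the second object

t = math.sqrt(2 * drop / g)   # fall time, identical for both objects
separation = v_x * t          # horizontal distance covered = initial separation
print(f"t = {t:.2f} s, initial separation = {separation:.2f} m")  # ~2.02 s, ~5.05 m
```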