Object B is dropped from a height of 100 m while moving horizontally at 115 m/s. How much time will it take for object B to hit the ground? (Round your answer to the nearest tenth and use "s" in your answer.)

100 = (1/2) g t^2

h = (1/2) g t^2

t = √(2h / g)

To find the time it takes for object B to hit the ground, we can use the equation of motion for vertical motion under the influence of gravity:

h = (1/2)gt^2

Where:
h is the height of the object (100 m in this case),
g is the acceleration due to gravity (approximately 9.8 m/s^2),
t is the time it takes for the object to hit the ground (what we want to find).

We need to determine the time it takes for the object to fall vertically. Since the horizontal velocity does not affect the vertical motion, we can ignore it in this calculation.

Rearranging the equation, we have:

t = √(2h / g)

Now, let's substitute the known values into the equation:

t = √(2 * 100 m / 9.8 m/s^2)

Calculating further:

t = √(20.41 s^2)

t ≈ 4.52 s

t ≈ 4.5 s (rounded to the nearest tenth)

Therefore, it will take approximately 4.5 seconds (4.5 s) for object B to hit the ground.
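The calculation above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original solution; the function name `fall_time` and the constant `G` are my own choices.

```python
import math

G = 9.8  # acceleration due to gravity, m/s^2

def fall_time(height_m: float) -> float:
    """Time in seconds for an object dropped from height_m to hit the ground.

    The horizontal velocity does not affect vertical motion, so only the
    drop height and gravity enter the formula t = sqrt(2h / g).
    """
    return math.sqrt(2 * height_m / G)

t = fall_time(100.0)
print(f"{t:.1f} s")  # prints "4.5 s"
```

Note that the 115 m/s horizontal speed never appears in the code, mirroring the point made above: vertical and horizontal motion are independent.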