A brick is thrown vertically upward with an initial speed of 5.00 m/s from the roof of a building. If the building is 112.0 m tall, how much time passes before the brick lands on the ground?

a = -g = -9.8 m/s^2

v = v_initial + a t = 5 - 9.8t

h = h_initial + v_initial t + (1/2)a t^2 = 112 + 5t - 4.9t^2 = 0 when the brick lands

so

112 + 5t - 4.9t^2 = 0

4.9t^2 - 5t - 112 = 0

Solve the quadratic and take the positive root:

t = [5 + sqrt(5^2 + 4(4.9)(112))] / (2 * 4.9) = (5 + 47.1)/9.8 ≈ 5.3 s
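
As a quick numerical check, the quadratic above can be solved directly in a few lines of Python. This is just a sketch; the variable names (v0, h0, g) are chosen here for readability and are not from the original working.

```python
import math

# Known values from the problem (upward taken as positive)
v0 = 5.00    # initial speed, m/s
h0 = 112.0   # roof height above the ground, m
g = 9.8      # magnitude of gravitational acceleration, m/s^2

# Height above the ground: h(t) = h0 + v0*t - 0.5*g*t^2, which is zero
# at landing, i.e. 4.9*t^2 - 5*t - 112 = 0 in standard quadratic form.
a, b, c = 0.5 * g, -v0, -h0
t_land = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)  # positive root only

print(f"time to land: {t_land:.2f} s")  # about 5.32 s
```

Only the positive root is physical; the negative root corresponds to a time before the brick was thrown.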

To solve this problem, we can use the equations of motion for one-dimensional motion with constant acceleration.

The equation that relates displacement (s), initial velocity (u), acceleration (a), and time (t) is:

s = ut + (1/2)at^2

In this case, the brick is thrown vertically upward, so the acceleration is due to gravity, approximately 9.8 m/s^2 directed downward. The initial velocity (u) is 5.00 m/s upward. Taking upward as positive, a = -9.8 m/s^2, and the brick lands when its displacement from the roof is s = -112.0 m, i.e. 112.0 m below the launch point. Note that the brick's velocity is not zero when it lands; the equation v = u + at with v = 0 only gives the time to reach the highest point (t = 5.00/9.8 ≈ 0.51 s), not the time to reach the ground.

Substituting the values:

-112.0 = 5.00 t - 4.9 t^2

Rearranging into standard quadratic form:

4.9 t^2 - 5.00 t - 112.0 = 0

Applying the quadratic formula and keeping the positive root:

t = [5.00 + sqrt(5.00^2 + 4(4.9)(112.0))] / (2 * 4.9)
t = (5.00 + 47.1) / 9.8
t ≈ 5.3 s

Therefore, it takes approximately 5.3 seconds for the brick to land on the ground.
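
As a sanity check on this answer, plugging the landing time back into the kinematic equations should give a height of roughly zero and a large downward velocity, which is exactly why assuming v = 0 at landing gives the wrong time. A minimal Python sketch, using the same values and sign convention as above (variable names are illustrative):

```python
import math

v0, h0, g = 5.00, 112.0, 9.8  # m/s, m, m/s^2; upward is positive

# Landing time from the positive root of 4.9*t^2 - 5*t - 112 = 0
t_land = (v0 + math.sqrt(v0**2 + 2 * g * h0)) / g

h_at_landing = h0 + v0 * t_land - 0.5 * g * t_land**2  # should be ~0 m
v_at_landing = v0 - g * t_land                          # large and negative (downward)

print(f"t = {t_land:.2f} s, h = {h_at_landing:.2f} m, v = {v_at_landing:.1f} m/s")
# expected: t ≈ 5.32 s, h ≈ 0 m, v ≈ -47.1 m/s
```

The impact speed of roughly 47 m/s confirms the brick is nowhere near rest when it hits the ground.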