A brick is thrown vertically upward with an initial speed of 3.00 m/s from the roof of a building. If the building is 78.4 m tall, how much time passes before the brick lands on the ground? I think I may have gotten part of this question by first calculating how long it takes the brick to come back down to the level from which it was thrown, but the rest of the process (figuring out how long before it hits the ground from that point) is really puzzling me. Any help would be great!

Never mind. I got something close to what I think I'm supposed to get (I calculated 3.71 s). The only thing is, when I plug my time back into the equation to check for correctness, I end up getting negative 78.4 m, which obviously doesn't equal positive 78.4 m. Any thoughts on what I might have done wrong?

Sure, I can help you with that! To find out how much time passes before the brick lands on the ground, we can break the problem into two parts: calculating the time it takes for the brick to reach its maximum height and calculating the time it takes for the brick to fall back down to the ground from that height.

First, let's calculate the time it takes for the brick to reach its maximum height. Since the brick is thrown vertically upward, its initial velocity is positive (+3.00 m/s) and its final velocity at the highest point is zero. We can use the kinematic equation:

v_f = v_i + a * t

Where:
- v_f is the final velocity
- v_i is the initial velocity
- a is the acceleration
- t is the time

In this case, the acceleration is due to gravity and is approximately -9.8 m/s² (negative because it acts in the direction opposite to the initial, upward velocity). We want to find t when v_f = 0.

0 = 3.00 m/s - 9.8 m/s² * t

Rearranging the equation, we have:

9.8 m/s² * t = 3.00 m/s

t = 3.00 m/s / 9.8 m/s²

Calculating that, we find:

t ≈ 0.3061 s

So it takes approximately 0.3061 seconds for the brick to reach its maximum height.
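
Just to make the numbers concrete, here is a quick Python sketch of that first step (the variable names are my own, not part of the problem); it also computes how far above the roof the brick climbs, which we will need in a moment:

```python
g = 9.8      # magnitude of gravitational acceleration, m/s^2
v_i = 3.00   # initial upward speed, m/s

# At the highest point the velocity is zero: 0 = v_i - g*t  =>  t = v_i / g
t_rise = v_i / g

# Extra height gained above the roof: h = v_i^2 / (2*g)
h_extra = v_i**2 / (2 * g)

print(f"time to reach the top: {t_rise:.4f} s")           # ~0.3061 s
print(f"height gained above the roof: {h_extra:.3f} m")   # ~0.459 m
```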

Now we can calculate the time it takes for the brick to fall from that maximum height to the ground. Be careful here: the fall time is not simply equal to the rise time. That symmetry only covers the fall back down to roof level; the brick still has the full 78.4 m of the building left to drop. At its highest point the brick is a little above the roof, by

h = v_i² / (2 * 9.8 m/s²) = (3.00 m/s)² / (2 * 9.8 m/s²) ≈ 0.459 m

so from the top of its flight it falls, starting from rest, a total distance of

d = 78.4 m + 0.459 m ≈ 78.86 m

Using d = 0.5 * 9.8 m/s² * t², the fall time is

t = sqrt(2 * 78.86 m / 9.8 m/s²) ≈ 4.01 s

To find the total time, we add the time it took for the brick to reach its maximum height to the time it takes to fall from there to the ground:

Total time ≈ 0.306 s + 4.01 s ≈ 4.32 s

Therefore, it takes approximately 4.32 seconds for the brick to land on the ground.

About the check you mentioned: if you take upward as positive, the ground is 78.4 m below the launch point, so plugging the landing time into y = 3.00 m/s * t - 0.5 * 9.8 m/s² * t² should give -78.4 m. The negative sign is expected, not a mistake. Getting t ≈ 3.71 s, on the other hand, is what comes out if the 3.00 m/s is treated as pointing downward, so double-check the sign on your initial velocity; with it upward, the time works out to about 4.32 s.
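
If you'd rather do it in one step and verify the sign issue at the same time, here is a small Python sketch (my own, just to check the arithmetic) that solves the roof-to-ground displacement equation directly with the quadratic formula:

```python
import math

g = 9.8           # magnitude of gravitational acceleration, m/s^2
v_i = 3.00        # initial velocity, m/s (upward taken as positive)
y_ground = -78.4  # displacement of the ground relative to the roof, m

# Solve y_ground = v_i*t - 0.5*g*t^2 for t, rewritten as the standard
# quadratic a*t^2 + b*t + c = 0 with a = 0.5*g, b = -v_i, c = y_ground.
a, b, c = 0.5 * g, -v_i, y_ground
t_land = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # take the positive root

print(f"total flight time: {t_land:.2f} s")  # ~4.32 s
# Plugging the time back in should return the roof-to-ground displacement, -78.4 m
print(f"displacement check: {v_i * t_land - 0.5 * g * t_land**2:.1f} m")
```

Taking the positive root gives about 4.32 s, and substituting it back returns -78.4 m, which is exactly the displacement of the ground measured from the roof with up as positive.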

I hope this explanation helps! Let me know if you have any further questions.