A brick is thrown vertically upward with an initial speed of 4.40 m/s from the roof of a building. If the building is 74.0 m tall, how much time passes before the brick lands on the ground? I still don't get how to solve it.

Your constant name changes are annoying. Show your work if you want further assistance.

I got 0=4.9t^2-4.4t-74

How can I find t from here?

Use the quadratic formula.

To solve this problem, we can use the equations of motion under constant acceleration. In this case, the only significant force acting on the brick is the force of gravity, which causes it to accelerate downwards.

First, we need to find the time it takes for the brick to reach its highest point. At the highest point, the vertical velocity becomes zero before the brick starts falling back down. To find this time, we can use the equation:

vf = vi + at

where vf is the final velocity, vi is the initial velocity, a is the acceleration, and t is the time. Since the final velocity at the highest point is 0 m/s, the equation becomes:

0 = 4.40 m/s + (-9.81 m/s^2) * t

Solving for t, we get:

t = 4.40 m/s / 9.81 m/s^2

t ≈ 0.449 seconds (rounded to three decimal places)

That, however, is only the time to reach the highest point. Doubling it (≈ 0.897 s) gives the time for the brick to return to roof level, where it is moving downward at 4.40 m/s; it does not account for the 74.0 m fall to the ground, so doubling alone does not give the total time.

The cleanest route is the quadratic you already wrote. Taking down as positive, the displacement to the ground is 74.0 m:

74.0 = -4.40 t + (1/2)(9.81) t^2

which rearranges to

4.905 t^2 - 4.40 t - 74.0 = 0

(your equation, with g rounded to 9.8 m/s^2). The quadratic formula then gives:

t = [4.40 ± √(4.40^2 + 4 · 4.905 · 74.0)] / (2 · 4.905) = [4.40 ± √1471.2] / 9.81

Only the positive root is physical, so t ≈ (4.40 + 38.36) / 9.81 ≈ 4.36 s.

So, it takes approximately 4.36 seconds for the brick to land on the ground.
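If you want to check the arithmetic, here is a quick numeric sketch in plain Python. It assumes g = 9.81 m/s^2 and takes downward as the positive direction, matching the sign convention above:

```python
import math

# Known values from the problem statement
v0 = 4.40   # initial upward speed, m/s
h = 74.0    # building height, m
g = 9.81    # gravitational acceleration, m/s^2

# Taking down as positive: h = -v0*t + 0.5*g*t^2
# => 0.5*g*t^2 - v0*t - h = 0, solved with the quadratic formula
a, b, c = 0.5 * g, -v0, -h
disc = b * b - 4 * a * c
t = (-b + math.sqrt(disc)) / (2 * a)  # positive root is the physical one

print(f"time to land: {t:.2f} s")  # ≈ 4.36 s
```

Changing g to 9.8 (the value behind the 4.9 coefficient in your equation) shifts the answer by only a few thousandths of a second, so 4.36 s holds either way.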