A brick is thrown vertically upward with an initial speed of 4.40 m/s from the roof of a building. If the building is 74.0 m tall, how much time passes before the brick lands on the ground?

What is your thinking on this?

Two of us answered a very similar question from you yesterday.

Right here:
http://www.jiskha.com/display.cgi?id=1265922413#1265922413.1265936139
Please try it yourself, using the same set of equations, before posting again.

To find the time it takes for the brick to land on the ground, we can use the kinematic equations for motion with constant acceleration (free fall).

First, let's set up the motion of the brick. Taking upward as positive, the initial velocity is +4.40 m/s and the acceleration due to gravity is -9.8 m/s^2; the negative sign indicates that the acceleration is directed downward.

The brick will continue to rise until its velocity becomes zero at the highest point. At this point, the brick starts to fall downward due to gravity. It then accelerates downward, passing the roof level on the way, until it reaches the ground 74.0 m below the launch point.
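To make these phases concrete, here is a minimal Python sketch (the variable names are my own, not from the problem) that computes when the velocity reaches zero. Note that this gives only the time to the highest point, not the time to reach the ground:

```python
u = 4.40   # initial upward speed, m/s
g = 9.8    # magnitude of gravitational acceleration, m/s^2

# Velocity at time t is v = u - g*t; it is zero at the highest point.
t_apex = u / g                # ~0.449 s: time to the apex only
h_apex = u ** 2 / (2 * g)     # ~0.99 m above the roof at the apex
print(t_apex, h_apex)
```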

To solve for the time it takes the brick to land, we need the equation of motion that relates displacement to time:

Displacement (y) = initial velocity (u) * time (t) + (1/2) * acceleration (a) * t^2

Note that the final velocity when the brick lands is NOT zero; the brick is moving at its fastest just before impact. What we actually know is the displacement: measuring from the roof with upward positive, the brick ends up at y = -74.0 m. The initial velocity is u = +4.40 m/s and the acceleration is a = -9.8 m/s^2.

Substituting the values into the equation, we have:

-74.0 = 4.40 * t - 4.9 * t^2

Rearranging into standard quadratic form:

4.9 * t^2 - 4.40 * t - 74.0 = 0

Applying the quadratic formula, t = [4.40 ± sqrt(4.40^2 + 4 * 4.9 * 74.0)] / (2 * 4.9), and keeping the positive root:

t = (4.40 + 38.34) / 9.8

Calculating this, we get:

t ≈ 4.36 seconds

Therefore, it takes approximately 4.36 seconds for the brick to land on the ground. (The negative root of the quadratic is discarded, since it would correspond to a time before the throw.)
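For completeness, here is a quick numeric check of the quadratic, a sketch using the same sign convention as above:

```python
import math

# Solve 4.9*t^2 - 4.40*t - 74.0 = 0 for the landing time.
a, b, c = 0.5 * 9.8, -4.40, -74.0          # coefficients of a*t^2 + b*t + c = 0
disc = b ** 2 - 4 * a * c                  # discriminant, = 1469.76
t_land = (-b + math.sqrt(disc)) / (2 * a)  # keep the positive root
print(round(t_land, 2))                    # prints 4.36
```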