A construction worker drops a hammer from a height of 50 feet. After how many seconds will the hammer land on the ground?

Use H=(1/2)gt²

where
H = height dropped (from rest)
g = acceleration due to gravity (32.2 ft/s² or 9.81 m/s²)
t = time in seconds.
Solve for t.

To find out how many seconds it will take for the hammer to land on the ground, we can use the equation of motion for an object falling freely due to gravity.

The equation is:

h = (1/2) * g * t^2

Where:
h = height (measured in feet)
g = acceleration due to gravity (32.2 feet/second^2)
t = time (in seconds)

In this case, the hammer is dropped from a height of 50 feet. Let's substitute the given values into the equation:

50 = (1/2) * 32.2 * t^2

Now, let's simplify the equation:

50 = 16.1 * t^2

Divide both sides of the equation by 16.1:

50 / 16.1 = t^2

So t^2 ≈ 3.106.

Now, take the square root of both sides to solve for t:

t = √(50 / 16.1) ≈ √3.106

This gives us approximately:

t ≈ 1.76 seconds

So, it will take approximately 1.76 seconds for the hammer to land on the ground.
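As a quick sanity check, here is a minimal Python sketch of the same calculation, rearranging h = (1/2) * g * t^2 into t = √(2h / g); the variable names are just illustrative.

import math

# Given values (US customary units)
height_ft = 50.0   # drop height in feet
g_ft_s2 = 32.2     # acceleration due to gravity in ft/s^2

# h = (1/2) * g * t^2  =>  t = sqrt(2 * h / g)
t_seconds = math.sqrt(2 * height_ft / g_ft_s2)

print(f"t ≈ {t_seconds:.2f} seconds")  # prints t ≈ 1.76 seconds

Running this prints t ≈ 1.76 seconds, matching the result worked out above.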