A technician at the top of a 200 m tall radio tower drops her hammer.

How much time does it take for the hammer to strike the ground?

KE at bottom = (1/2) m v² = m g h

so v at bottom = sqrt(2 g h)

Since the hammer accelerates uniformly from rest, its average speed is half its final speed:

average v = v/2 = (1/2) sqrt(2 g h) = 0.5 • sqrt(2 • 9.81 • 200) ≈ 31.3 m/s

time = distance / average speed = 200 / 31.3 ≈ 6.39 seconds
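
The same average-speed arithmetic in Python (a minimal sketch; the variable names g, h, v_final, and v_avg are mine, not from the original answer):

    import math

    g = 9.81   # gravitational acceleration, m/s^2
    h = 200.0  # tower height, m

    v_final = math.sqrt(2 * g * h)  # speed at the bottom from energy conservation, ~62.6 m/s
    v_avg = v_final / 2             # average speed under constant acceleration from rest
    t = h / v_avg                   # fall time = distance / average speed

    print(f"v_final = {v_final:.1f} m/s, t = {t:.2f} s")  # t = 6.39 s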

h = gt²/2

t = sqrt(2h/g) = sqrt(2 • 200 / 9.8) ≈ 6.39 s
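
The closed form translates to a one-liner (sketch, using g = 9.8 m/s² as above):

    import math
    t = math.sqrt(2 * 200 / 9.8)  # t = sqrt(2h/g)
    print(round(t, 2))            # 6.39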

To calculate the time it takes for the hammer to strike the ground, we can use the equations of motion under constant acceleration. In this case, the hammer is dropped from rest, so its initial velocity is 0 m/s.

The height of the tower is 200 meters, and we can assume the acceleration due to gravity is approximately 9.8 m/s².

Let's use the equation:

s = ut + (1/2)at²

where:
s is the distance (height) traveled by the hammer (200 meters),
u is the initial velocity (0 m/s),
t is the time it takes for the hammer to fall (which is what we're trying to find), and
a is the acceleration due to gravity (9.8 m/s², taking the downward direction as positive since the hammer falls downward).

Plugging in the values, the equation becomes:

200 = (0)t + (1/2)(9.8)t²

Simplifying the equation further:

200 = 4.9t²

Dividing both sides by 4.9:

t² = 200 / 4.9

t² ≈ 40.82

Taking the square root of both sides:

t ≈ √40.82

t ≈ 6.39 seconds

Therefore, it takes approximately 6.39 seconds for the hammer to strike the ground.
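
For completeness, here is a small Python sketch that solves the full quadratic s = ut + (1/2)at² for t, so it also handles a nonzero initial velocity; the function name fall_time and its defaults are mine, not from the original answer:

    import math

    def fall_time(s, u=0.0, a=9.8):
        # Solve (a/2) t^2 + u t - s = 0 for the positive root
        # (downward taken as positive, as in the derivation above).
        return (-u + math.sqrt(u * u + 2 * a * s)) / a

    print(round(fall_time(200), 2))  # 6.39, matching the hand calculation

Dropping the hammer corresponds to u = 0, which reduces the quadratic back to t = sqrt(2s/a).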