Suppose an astronaut drops a feather from 1.3 m above the surface of the Moon. If the acceleration due to gravity on the Moon is 1.62 m/s^2 downward, how long does it take the feather to hit the Moon's surface?

To find the time it takes for the feather to hit the Moon's surface, we can use the kinematic equation:

\[d = v_i t + \frac{1}{2} a t^2\]

Where:
d = displacement (distance the feather falls)
v_i = initial velocity (which is 0, as the feather is dropped)
a = acceleration due to gravity on the Moon (1.62 m/s^2; taking downward as the positive direction, both d and a are positive)
t = time

Since the feather is dropped from rest, its initial velocity, \(v_i\), is 0. Therefore, the equation simplifies to:

\[d = \frac{1}{2} a t^2\]

Substituting the given values, we have:

\[1.3\ \text{m} = \frac{1}{2}\,(1.62\ \text{m/s}^2)\,t^2\]

Now, let's solve for t. Rearranging the equation, we get:

\[t^2 = \frac{2 \cdot 1.3\ \text{m}}{1.62\ \text{m/s}^2} = \frac{2.6\ \text{m}}{1.62\ \text{m/s}^2} \approx 1.605\ \text{s}^2\]

Taking the square root of both sides:

\[t = \sqrt{1.605\ \text{s}^2} \approx 1.27\ \text{s}\]

Therefore, it takes approximately 1.27 seconds for the feather to hit the Moon's surface when dropped from 1.3 m above it.
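
As a quick numerical sanity check, here is a minimal Python sketch of the same computation. The variable names (`height_m`, `g_moon`) are illustrative choices, not part of the original problem:

```python
import math

# Given values (downward taken as the positive direction)
height_m = 1.3  # drop height above the lunar surface, in m
g_moon = 1.62   # lunar gravitational acceleration, in m/s^2

# From d = (1/2) a t^2 with v_i = 0, solve for t: t = sqrt(2d / a)
t_squared = 2 * height_m / g_moon
t = math.sqrt(t_squared)

print(f"t^2 = {t_squared:.4f} s^2")  # t^2 = 1.6049 s^2
print(f"t   = {t:.2f} s")            # t   = 1.27 s
```

Running this reproduces the result above: the feather takes about 1.27 s to fall.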

\(d = \frac{1}{2} a t^2\), correct?