Suppose an astronaut drops a feather from 1.4 m above the surface of the Moon. If the acceleration due to gravity on the Moon is 1.62 m/s² downward, how long does it take the feather to hit the Moon's surface?


To calculate the time it takes for the feather to hit the Moon's surface, we can use the equation of motion:

\[ s = ut + \frac{1}{2}at^2 \]

where:
- s is the displacement (distance)
- u is the initial velocity (which is 0 in this case since the feather is dropped)
- a is the acceleration due to gravity on the Moon (1.62 m/s²)
- t is the time

Since the feather is dropped, the initial velocity (u) is 0. So the equation simplifies to:

\[ s = \frac{1}{2}at^2 \]

In this case, the displacement (s) is the distance the feather falls: 1.4 m from its release point down to the surface. So:

\[ s = 1.4 \, \text{m} \]

By substituting the known values, we can solve for t.

\[ 1.4 \, \text{m} = \frac{1}{2} \times 1.62 \, \text{m/s}^2 \times t^2 \]

To isolate t, we can rearrange the equation:

\[ 2 \times 1.4 \, \text{m} = 1.62 \, \text{m/s}^2 \times t^2 \]

\[ 2.8 \, \text{m} = 1.62 \, \text{m/s}^2 \times t^2 \]

Now, we divide both sides of the equation by 1.62 to solve for \( t^2 \):

\[ \frac{2.8 \, \text{m}}{1.62 \, \text{m/s}^2} = t^2 \]

\[ t^2 \approx 1.728 \, \text{s}^2 \]

Finally, we take the square root of both sides to find the value of t:

\[ t \approx \sqrt{1.728 \, \text{s}^2} \]
\[ t \approx 1.31 \, \text{s} \]

Therefore, it takes approximately 1.31 seconds for the feather to hit the Moon's surface.
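As a quick check, the same calculation can be sketched in Python by rearranging \( s = \frac{1}{2}at^2 \) directly to \( t = \sqrt{2s/a} \) (the variable names `s`, `a`, and `t` here simply mirror the symbols used above):

```python
import math

s = 1.4    # drop height above the lunar surface (m)
a = 1.62   # acceleration due to gravity on the Moon (m/s^2)

# From s = (1/2) * a * t^2 with initial velocity u = 0,
# solve for t: t = sqrt(2s / a)
t = math.sqrt(2 * s / a)
print(f"t ≈ {t:.2f} s")  # t ≈ 1.31 s
```

Running this confirms the hand calculation above to within rounding.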