Gravity on the moon is about one-sixth of gravity on Earth. An astronaut standing on a tower 20 feet above the moon's surface throws a ball upward with a velocity of 30 feet per second. The height of the ball at any time t (in seconds) is h(t) = -2.67t^2 + 30t + 20. To the nearest tenth of a second, how long will it take for the ball to hit the ground?

h(t) will be zero when the ball reaches the ground.

0=-2.67t^2+30t+20

Use the quadratic formula.

t ≈ 11.9 seconds

To find the time it takes for the ball to hit the ground, we need to determine when the height, h(t), becomes zero.

Given the equation for the height of the ball as a function of time, h(t) = -2.67t^2 + 30t + 20, we can set h(t) equal to zero and solve for t.

-2.67t^2 + 30t + 20 = 0

To solve this quadratic equation, we can use the quadratic formula:

t = (-b ± √(b^2 - 4ac)) / (2a)

In this case, a = -2.67, b = 30, and c = 20.

Plugging these values into the quadratic formula:

t = (-30 ± √(30^2 - 4(-2.67)(20))) / (2(-2.67))

Simplifying further:

t = (-30 ± √(900 - (-213.6))) / (-5.34)

t = (-30 ± √(900 + 213.6)) / (-5.34)

t = (-30 ± √(1113.6)) / (-5.34)

Using a calculator to find the square root of 1113.6:

t ≈ (-30 ± 33.4) / (-5.34)

We have two possible values for t:

t ≈ (-30 + 33.4) / (-5.34) ≈ -0.6 seconds
t ≈ (-30 - 33.4) / (-5.34) ≈ 11.9 seconds

Since time cannot be negative, we discard the negative root and take the positive value as the time it takes for the ball to hit the ground.

Therefore, to the nearest tenth of a second, it will take approximately 11.9 seconds for the ball to hit the ground.
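As a quick check, the same quadratic-formula computation can be run in a few lines of Python (a minimal sketch using only the coefficients given in the problem):

```python
import math

# Coefficients of h(t) = -2.67 t^2 + 30 t + 20 from the problem statement
a, b, c = -2.67, 30.0, 20.0

# Quadratic formula: t = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c          # 900 + 213.6 = 1113.6
root = math.sqrt(disc)       # ≈ 33.4

t1 = (-b + root) / (2*a)     # negative root (before the throw), discard
t2 = (-b - root) / (2*a)     # positive root: time when the ball lands

print(round(t1, 1))  # -0.6
print(round(t2, 1))  # 11.9
```

Only the positive root is physically meaningful, confirming the answer of about 11.9 seconds.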