A robotic basketball player tosses a ball upward from a height of 7 feet, at an initial speed of 111 feet per second.

The height of the ball is modeled by the equation h = -16t^2 + 111t + 7, where h is the height above the ground in feet and t is the time measured in seconds.
How long does it take for the ball to hit the ground?

just solve for t when h=0
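
For example, here's a minimal symbolic check (assuming Python with the sympy library is available):

```python
# Minimal sketch: solve -16t^2 + 111t + 7 = 0 for t using sympy
from sympy import symbols, solve

t = symbols('t')
roots = solve(-16*t**2 + 111*t + 7, t)
print(roots)  # roots are -1/16 and 7; only the positive one is physical, so t = 7 seconds
```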

good old Algebra I

7 seconds

To find the time it takes for the ball to hit the ground, we need to determine when the height of the ball is equal to zero. In other words, we need to solve the equation:

h = -16t^2 + 111t + 7 = 0

To solve this quadratic equation, we can use the quadratic formula:

t = (-b ± sqrt(b^2 - 4ac)) / (2a)

In this case, a = -16, b = 111, and c = 7. Substituting these values:

t = (-111 ± sqrt(111^2 - 4(-16)(7))) / (2(-16)) = (-111 ± sqrt(12321 + 448)) / (-32) = (-111 ± 113) / (-32)

The two roots are t = -1/16 and t = 7. Since a negative time is not physically meaningful, the ball hits the ground 7 seconds after it is tossed.
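
The same arithmetic can be checked numerically; here's a short Python sketch using only the standard-library math module and the coefficients above:

```python
import math

# Coefficients from h(t) = -16t^2 + 111t + 7
a, b, c = -16, 111, 7

disc = b**2 - 4*a*c           # 111^2 - 4(-16)(7) = 12321 + 448 = 12769
root = math.sqrt(disc)        # sqrt(12769) = 113

t1 = (-b + root) / (2*a)      # (-111 + 113) / -32 = -0.0625 (before the toss, discard)
t2 = (-b - root) / (2*a)      # (-111 - 113) / -32 = 7.0 (ball hits the ground)

print(t1, t2)                 # -0.0625 7.0
```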