Posted by **A.S.** on Sunday, July 24, 2011 at 2:49pm.

Suppose a scientist on Earth throws a baseball upward. The scientist lets go of the ball 2 meters above the ground with an initial velocity of 10 meters per second. How long does it take for the ball to hit the ground (H = 0)? (Use the quadratic formula to solve this problem; show all work.) Write a statement to interpret your results.
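A sketch of the solution, assuming the standard free-fall model on Earth, H(t) = -4.9t² + v₀t + h₀ (the problem itself does not write out the equation, so the -4.9 coefficient, i.e. -g/2 with g ≈ 9.8 m/s², is an assumption consistent with the given data):

```python
import math

# Height model (SI units): H(t) = -4.9*t^2 + v0*t + h0
# v0 = 10 m/s (initial upward velocity), h0 = 2 m (release height).
# Setting H(t) = 0 gives the quadratic -4.9*t^2 + 10*t + 2 = 0.
a, b, c = -4.9, 10.0, 2.0

# Quadratic formula: t = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b * b - 4 * a * c
t1 = (-b + math.sqrt(disc)) / (2 * a)
t2 = (-b - math.sqrt(disc)) / (2 * a)

# One root is negative; only the positive root is physically meaningful,
# since the ball cannot land before it is thrown.
t_hit = max(t1, t2)
print(round(t_hit, 2))  # ≈ 2.22 seconds
```

Interpretation: the positive root, about 2.22 seconds, is the time until the ball hits the ground; the negative root has no physical meaning here.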
