You throw a ball straight into the air from a height of 4 feet and with a speed of 8.22 m/s. The moment the ball leaves your hand you start running away at a speed of 3.36 m/s. How far are you from the ball, the moment it hits the ground?

To find out how far you are from the ball when it lands, we first need to determine how long the ball is in the air.

The height the ball is thrown from is 4 feet, which can be converted to meters by multiplying by 0.3048 (since 1 foot = 0.3048 meters). So, the initial height of the ball is 4 x 0.3048 = 1.2192 meters.

We can use the kinematic equation for vertical motion to find the time it takes for the ball to hit the ground. Taking "up" as positive, the ball's height above the ground at time t is:

y(t) = h + vi * t - (1/2) * g * t^2

Where:
h = initial height (1.2192 meters)
vi = initial upward velocity (8.22 m/s)
t = time
g = acceleration due to gravity (approximately 9.8 m/s^2)

The ball hits the ground when y(t) = 0. Setting the equation to zero and rearranging, we get:

(1/2) * g * t^2 - vi * t - h = 0

Solving this quadratic equation will give us the time it takes for the ball to hit the ground. Using the quadratic formula:

t = (vi ± √(vi^2 - 4 * (1/2) * g * (-h))) / (2 * (1/2) * g) = (vi ± √(vi^2 + 2 * g * h)) / g

Plugging in the values:

t = (8.22 ± √(8.22^2 + 2 * 9.8 * 1.2192)) / 9.8 = (8.22 ± 9.56) / 9.8

This gives two solutions, t ≈ 1.81 s and t ≈ -0.14 s. Since time cannot be negative in this context, we disregard the negative root and keep t ≈ 1.81 s.
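
If you want to check the arithmetic, here is a minimal sketch in Python (assuming g = 9.8 m/s^2 and ignoring air resistance):

```python
import math

h = 4 * 0.3048   # initial height: 4 ft converted to meters (1.2192 m)
vi = 8.22        # initial upward speed, m/s
g = 9.8          # acceleration due to gravity, m/s^2

# Solve (1/2)*g*t^2 - vi*t - h = 0 with the quadratic formula
disc = math.sqrt(vi**2 + 2 * g * h)
t_up = (vi + disc) / g   # positive root: time until the ball hits the ground
t_neg = (vi - disc) / g  # negative root: not physical, discarded

print(round(t_up, 2), round(t_neg, 2))   # ≈ 1.81 s and ≈ -0.14 s
```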

Once you have the positive time, the distance you have covered is your running speed multiplied by that time: 3.36 m/s × 1.81 s ≈ 6.1 meters.

Because the ball is thrown straight up, it lands at the same spot it was released from, which is also where you were standing when you started to run. So your distance from the ball when it hits the ground is simply the distance you have run: about 6.1 meters.
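
Continuing the same sketch, the final separation follows directly (the 1.8147 s used here is the positive root computed above):

```python
run_speed = 3.36     # your running speed, m/s
t = 1.8147           # flight time found above, s

distance = run_speed * t
print(round(distance, 2))   # ≈ 6.1 m
```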