Two balls (Ball 1 and Ball 2) are released from the top of a tower. Ball 2 is thrown 3.14 seconds after Ball 1 is dropped. Ball 2 is thrown downward with a velocity of 3.49 m/s. Determine how far Ball 1 has fallen (to two decimal places) by the time Ball 2 is thrown.

If Ball 1 has been dropping for 3.49 seconds, then Ball 1 has dropped:

y = vo*t + (1/2)*g*t^2, where vo = 0.0 since Ball 1 was "dropped", not thrown.

or y = 0.5 * 9.8 * 3.49^2 = 59.68 meters

To determine how far Ball 1 has fallen by the time Ball 2 is thrown, we need to calculate the distance Ball 1 has traveled in 3.14 seconds.

To find the distance fallen by Ball 1, we can use the kinematic equation for distance:

d = 1/2 * g * t^2

where d is the distance, g is the acceleration due to gravity (approximately 9.8 m/s^2), and t is time.

By substituting the given values into the equation, we have:

d = 1/2 * 9.8 * (3.14)^2

d ≈ 1/2 * 9.8 * 9.8596

d ≈ 48.31204

Therefore, Ball 1 has fallen approximately 48.31 meters by the time Ball 2 is thrown.

Whoops, I noticed I used 3.49 (the throw speed) as the time instead of 3.14 seconds. Corrected:

y = 0.5 * 9.8 * 3.14^2 = 48.31 meters
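As a quick sanity check, the free-fall distance d = (1/2)*g*t^2 can be evaluated in a few lines of Python. This is just a sketch of the arithmetic above, assuming g = 9.8 m/s^2 and the 3.14 s drop time from the problem:

```python
# Distance fallen from rest under gravity: d = (1/2) * g * t^2
g = 9.8   # m/s^2, acceleration due to gravity (value used in the answers above)
t = 3.14  # s, time Ball 1 has been falling when Ball 2 is thrown

d = 0.5 * g * t**2
print(f"{d:.2f} m")  # 48.31 m
```

Note that the initial-velocity term vo*t drops out here because Ball 1 is released from rest (vo = 0).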