If an object is dropped from a 686 foot cliff and hits the ground in 7 seconds, how far did it fall in the first 3 seconds?

To calculate the distance an object falls in a given time (ignoring air resistance), we can use the constant-acceleration equation for free fall:

d = 0.5 * g * t^2

Where:
d is the distance fallen
g is the acceleration due to gravity (which is approximately 32.2 feet per second squared)
t is the time in seconds.

Before substituting numbers, note that the problem's own data pin down the acceleration. The object falls 686 feet in 7 seconds, so solving d = 0.5 * g * t^2 for g gives:

g = 2d / t^2 = (2 * 686) / 7^2 = 1372 / 49 = 28 ft/s^2

(The problem was evidently written with a rounded value of 28 ft/s^2 rather than the standard 32.2 ft/s^2 so that the numbers come out evenly; using 32.2 here would contradict the given 686-foot, 7-second fall.)

Substituting g = 28 ft/s^2 and t = 3 seconds into the equation:

d = 0.5 * 28 * 3^2
d = 0.5 * 28 * 9
d = 126 feet

Equivalently, since the distance fallen grows with the square of the elapsed time, d = 686 * (3/7)^2 = 686 * 9/49 = 126 feet.

Therefore, the object fell 126 feet in the first 3 seconds.
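As a quick sanity check, a few lines of Python reproduce the arithmetic above (variable names are just for illustration):

```python
# Sanity check for the free-fall arithmetic in this answer.
total_distance_ft = 686.0  # height of the cliff
total_time_s = 7.0         # time to reach the ground
t = 3.0                    # time window we are asked about

# Acceleration implied by the problem data: d = 0.5 * g * t^2  =>  g = 2d / t^2
g = 2 * total_distance_ft / total_time_s**2  # 28.0 ft/s^2

# Distance fallen in the first 3 seconds
d = 0.5 * g * t**2  # 126.0 ft

# Equivalent shortcut: distance scales with the square of time
d_ratio = total_distance_ft * (t / total_time_s)**2  # 126.0 ft

print(g, d, d_ratio)  # 28.0 126.0 126.0
```

Both routes agree, which confirms the data are internally consistent with a constant acceleration of 28 ft/s^2.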