One ball is dropped from a cliff. A second ball is thrown down 1.00 s later with an initial speed of 40.0 ft/s. How long after the second ball is thrown will the second ball overtake the first?

I could swear I did this one yesterday :(

ball 1
d = distance down; t = time after the second ball is thrown (so ball 1 has been falling for t + 1 s)
d = .5 g (t+1)^2

ball 2
d = 40 t + .5 g t^2

so
.5 g (t^2 + 2 t + 1) = 40 t + .5 g t^2
put in g = 32 ft/s² (the problem is in feet, so 9.81 m/s² won't do) and solve; the t² terms cancel, leaving a linear equation in t
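
Here is a minimal sketch (assuming Python with sympy, g = 32 ft/s² since the problem uses feet, and t measured from the moment the second ball is thrown) that solves that equation:

import sympy as sp

t = sp.symbols('t', positive=True)
g = 32  # ft/s^2, since the distances are in feet

ball1 = sp.Rational(1, 2) * g * (t + 1)**2    # dropped 1 s before the second ball
ball2 = 40 * t + sp.Rational(1, 2) * g * t**2  # thrown down at 40 ft/s

print(sp.solve(sp.Eq(ball1, ball2), t))  # [2] -> 2.00 s after the second throw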

To find the time it takes for the second ball to overtake the first, consider the vertical motion of each ball separately. Both balls move only vertically with constant acceleration, so we can use the constant-acceleration equations of motion.

First, let's focus on the first ball, which was dropped from the cliff. The equation describing its motion is:

y₁ = y₀ + v₀t + (1/2)at²

where:
y₁ is the vertical displacement of the first ball at time t,
y₀ is the initial starting position (in this case, the height of the cliff),
v₀ is the initial velocity of the first ball (0 since it was dropped),
t is the time elapsed since that ball was released, and
a is the acceleration due to gravity (-32 ft/s², taking upward as positive and downward as negative).

For the first ball, y₀ = 0 (at the top of the cliff) and v₀ = 0 (dropped from rest). Therefore, the equation simplifies to:

y₁ = (1/2)at²
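
For instance, 1.00 s after release the first ball is at y₁ = (1/2)(-32)(1.00)² = -16 ft, i.e. 16 ft below the edge of the cliff.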

Now, let's consider the second ball. The equation of motion for its vertical motion is:

y₂ = y₀ + v₀t + (1/2)at²

where:
y₂ is the vertical displacement of the second ball at time t,
y₀ is the initial starting position (height of the cliff),
v₀ is the initial velocity of the second ball (-40 ft/s, since it is thrown downward and downward is negative here),
t is the time elapsed since that ball was released, and
a is the acceleration due to gravity (-32 ft/s²).

For the second ball, y₀ = 0 (at the top of the cliff), so:

y₂ = -40t + (1/2)at²

The key step is the 1.00 s head start: let t be the time since the second ball was thrown. The first ball has then been falling for (t + 1) s, so its displacement is y₁ = (1/2)a(t + 1)². The second ball overtakes the first when the two displacements are equal:

(1/2)(-32)(t + 1)² = -40t + (1/2)(-32)t²

Expanding the left side gives -16t² - 32t - 16 = -40t - 16t². The t² terms cancel, leaving -32t - 16 = -40t, so 8t = 16 and t = 2.00 s.

The second ball overtakes the first 2.00 s after it is thrown.
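
As a quick numerical check (a minimal plain-Python sketch, taking downward as positive so the signs drop out, and assuming g = 32 ft/s²), both balls should be at the same depth 2.00 s after the second throw:

g = 32.0          # ft/s^2
t = 2.0           # seconds after the second ball is thrown

depth_ball1 = 0.5 * g * (t + 1.0)**2      # first ball has been falling for t + 1 s
depth_ball2 = 40.0 * t + 0.5 * g * t**2   # second ball thrown downward at 40 ft/s

print(depth_ball1, depth_ball2)           # 144.0 144.0 -> both 144 ft below the cliff edge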