A parachutist falling at the rate of 15 ft./sec. drops a stone. If he was 904 ft. above the ground when the stone was dropped, how long would it take the stone to reach the ground?

assuming a = -32 ft/sec^2

and since the initial velocity was -15 ft/sec from a height of 904 ft, I know

s = -16t^2 - 15t + 904
we want s = 0

16t^2 + 15t - 904 = 0
won't waste time trying to factor it ...
t = (-15 ± √58081)/32
= (-15 ± 241)/32 ----- ahh, it does factor
= 226/32 or some silly negative
= 113/16 seconds, or approximately 7.06 seconds

btw, it would have been factored as
(16t - 113)(t + 8) = 0
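
Just to double-check the arithmetic, here is a small Python sketch (my own addition, not part of the worked solution above) that solves 16t^2 + 15t - 904 = 0 with the quadratic formula:

import math

# coefficients of 16t^2 + 15t - 904 = 0
a, b, c = 16, 15, -904

disc = b*b - 4*a*c                    # 58081 = 241^2, so it does factor
t = (-b + math.sqrt(disc)) / (2*a)    # keep the positive root
print(disc, t)                        # 58081 7.0625  (= 113/16)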

Thank you. You’re so helpful.

To determine the time it takes for the stone to reach the ground, we can use a simple physics equation.

The equation that relates distance, initial velocity, time, and acceleration due to gravity is:
d = v0t + (1/2)gt^2

Where:
d = distance
v0 = initial velocity
g = acceleration due to gravity (approximately 32 ft/s^2)
t = time

In this case, the parachutist is falling at 15 ft/s when the stone is released, and a dropped object keeps the velocity of whatever is carrying it at the moment of release. So the stone starts with a downward velocity of 15 ft/s. Taking down as the positive direction, v0 = 15 ft/s.

We also know that the distance traveled by the stone is the height from which it was dropped, which is 904 ft. Thus, d = 904 ft.

Plugging the values into the equation, we have:
904 = 15t + (1/2) * 32 * t^2

Simplifying:
904 = 15t + 16t^2

Rearranging into standard quadratic form:
16t^2 + 15t - 904 = 0

Applying the quadratic formula and taking the positive root:
t = (-15 + √(15^2 + 4 * 16 * 904))/32 = (-15 + 241)/32 = 113/16 ≈ 7.06 seconds

Therefore, it would take approximately 7.06 seconds for the stone to reach the ground, which agrees with the solution above.
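
As a final sanity check (a minimal sketch of my own, assuming down is positive and g = 32 ft/s^2), a crude numerical integration of the fall lands on the same time:

g = 32.0     # ft/s^2, acceleration due to gravity
v = 15.0     # ft/s, the stone inherits the parachutist's downward speed
s = 0.0      # ft fallen so far
t = 0.0      # seconds elapsed
dt = 1e-4    # small time step for the Euler integration

while s < 904.0:
    v += g * dt
    s += v * dt
    t += dt

print(round(t, 3))   # ~7.06, matching 113/16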