A rock is thrown upward with a velocity of 20 m/s from the top of a 40-meter-high cliff, and it misses the cliff on the way back down. When will the rock be 7 meters above the water below? Round your answer to two decimal places.

Letting the water be at height 0, just solve

40 + 20t - 4.9t^2 = 7

i.e. 4.9t^2 - 20t - 33 = 0. The quadratic formula gives t = (20 + √(400 + 646.8)) / 9.8 (the other root is negative and is discarded), so

t ≈ 5.34 seconds
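As a quick numerical check, the quadratic can be solved directly. A minimal sketch in Python, assuming g = 9.8 m/s^2 as in the equation above:

```python
import math

# Height above the water as a function of time:
#   s(t) = 40 + 20*t - 4.9*t**2   (s0 = 40 m, v0 = 20 m/s, g = 9.8 m/s^2)
# Setting s(t) = 7 gives the quadratic 4.9*t**2 - 20*t - 33 = 0.

a, b, c = 4.9, -20.0, -33.0
disc = b**2 - 4*a*c                 # 400 + 646.8 = 1046.8
t = (-b + math.sqrt(disc)) / (2*a)  # keep the positive root only
print(round(t, 2))                  # → 5.34
```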

To find the time at which the rock is 7 meters above the water, we can use the equations of motion.

First, let's calculate the time it takes for the rock to reach its highest point. We can use the equation:

v = u + at

Where:
v = final velocity (0 m/s at the highest point)
u = initial velocity (20 m/s upward)
a = acceleration (due to gravity, which is approximately -9.8 m/s^2)
t = time

Rearranging the equation, we have:

t = (v - u) / a

Substituting the values, we get:

t = (0 - 20) / -9.8
t = 20 / 9.8
t ≈ 2.04 seconds

So, it takes about 2.04 seconds for the rock to reach its highest point.
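The stage-1 numbers can be reproduced in a few lines of Python (assuming g = 9.8 m/s^2 as above); this also gives the peak height above the water, since the rock rises an extra v0^2 / (2g) above the cliff top:

```python
# Stage 1: time to the highest point, and the peak height above the water.
# Assumes g = 9.8 m/s^2, launch speed 20 m/s, launch height 40 m.
g, v0, h0 = 9.8, 20.0, 40.0

t_up = v0 / g                  # v = v0 - g*t = 0 at the peak
h_max = h0 + v0**2 / (2 * g)   # extra rise of v0^2 / (2g) above the cliff top

print(round(t_up, 2))   # → 2.04
print(round(h_max, 2))  # → 60.41
```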

Next, we need the height of the highest point above the water. The rock rises an extra u^2 / (2|a|) above the cliff top:

h_max = 40 + 20^2 / (2 * 9.8)
h_max = 40 + 20.41
h_max ≈ 60.41 meters

From the highest point the rock falls from rest, and we want the time to drop to a height of 7 meters, a displacement of -(60.41 - 7) = -53.41 meters. Using the equation:

s = ut + (1/2)at^2

Where:
s = displacement (negative 53.41 meters)
u = initial velocity (0 m/s at the highest point)
a = acceleration (due to gravity, which is approximately -9.8 m/s^2)
t = time

Substituting the values, we have:

-53.41 = 0*t + (1/2)(-9.8)*t^2

Simplifying the equation, we get:

-4.9t^2 = -53.41
t^2 ≈ 10.90
t ≈ 3.30 seconds

So, it takes about 3.30 seconds for the rock to fall from the peak to a height of 7 meters.

Note that the rock passes the 7-meter mark only once: it starts at 40 meters, so on the way up it is always above 7 meters, and it first reaches 7 meters on the way down. The answer is therefore simply the sum of the two stages:

t_total = t_up + t_down
t_total = 2.04 + 3.30
t_total ≈ 5.34 seconds

So, the rock will be 7 meters above the water approximately 5.34 seconds after it is thrown upward, which matches the positive root of 40 + 20t - 4.9t^2 = 7.
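The two-stage route (rise to the peak, then free fall down to 7 meters) can be checked end to end with a short Python sketch, again assuming g = 9.8 m/s^2:

```python
import math

# End-to-end check: rise time to the peak plus free-fall time from the
# peak down to 7 m above the water.
g, v0, h0, target = 9.8, 20.0, 40.0, 7.0

t_up = v0 / g                                  # time to the highest point
h_max = h0 + v0**2 / (2 * g)                   # peak height above the water
t_down = math.sqrt(2 * (h_max - target) / g)   # fall from rest to 7 m

print(round(t_up + t_down, 2))                 # → 5.34
```

This agrees with solving 40 + 20t - 4.9t^2 = 7 in one step with the quadratic formula.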