A diver running 1.0 m/s dives out horizontally from the edge of a vertical cliff and reaches the water below 2.8 s later. How high was the cliff and how far from its base did the diver hit the water?

See the following link for an almost identical example:

http://www.jiskha.com/display.cgi?id=1285613339

To solve this problem, we can use the kinematic equations of motion. Let's break it down into two parts:

1. Determine the height of the cliff.
2. Calculate the horizontal distance from the base of the cliff to the point where the diver hits the water.

1. Determining the height of the cliff:
The vertical motion of the diver can be described by the equation:
h = vit + 1/2gt^2

Where:
h = height of the cliff (what we need to find)
vi = initial vertical velocity of the diver (0 m/s, since the diver leaves the cliff horizontally)
t = time taken by the diver to reach the water (2.8 s)
g = acceleration due to gravity (9.8 m/s^2, taking downward as the positive direction)

Plugging in the given values:
h = (0)(2.8) + 1/2(9.8)(2.8)^2

Simplifying the equation:
h = 1/2(9.8)(7.84)
h ≈ 38.4 m

Therefore, the height of the cliff is approximately 38.4 meters.
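As a quick numerical check, the drop height can be computed directly; here is a minimal Python sketch (variable names are my own):

```python
# Free-fall drop from rest, taking downward as positive
g = 9.8   # m/s^2, acceleration due to gravity
t = 2.8   # s, time to reach the water

h = 0.5 * g * t**2  # vi = 0, so the vi*t term vanishes
print(f"Cliff height: {h:.1f} m")  # Cliff height: 38.4 m
```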

2. Calculating the horizontal distance from the base of the cliff to the point where the diver hits the water:
The horizontal distance can be determined using the formula:
d = vt

Where:
d = horizontal distance traveled
v = horizontal velocity of the diver (1.0 m/s, given in the question)
t = time taken by the diver to reach the water (2.8 s)

Plugging in the given values:
d = (1.0 m/s)(2.8 s)
d = 2.8 m

Therefore, the diver hits the water 2.8 meters from the base of the cliff.
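The horizontal step works the same way; since projectile motion has no horizontal acceleration, the run-up speed simply persists for the full fall time. A short sketch:

```python
# Constant horizontal velocity: no horizontal acceleration during the fall
v = 1.0   # m/s, horizontal speed off the cliff edge
t = 2.8   # s, time in the air (fixed by the vertical fall)

d = v * t
print(f"Horizontal distance: {d:.1f} m")  # Horizontal distance: 2.8 m
```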

In summary, the height of the cliff is approximately 38.4 meters, and the diver hits the water 2.8 meters from the base of the cliff.