A rock is thrown horizontally from a cliff overlooking the ocean. If the cliff is 45 meters high and the rock is thrown at 18 m/s, what is the angle at which it strikes the ocean?

To find the angle at which the rock strikes the ocean, we first need to determine the time it takes for the rock to reach the water. From that time we can find the vertical component of the rock's velocity at impact, and the impact angle follows from the ratio of the vertical and horizontal velocity components.

Since the rock is thrown horizontally, the initial vertical velocity is 0 m/s. Ignoring air resistance, the only force acting on the rock is gravity, so it falls freely in the vertical direction while its horizontal velocity stays constant. The equation for the vertical displacement is given by:

y = ut + (1/2)gt^2

Where:
y = vertical displacement (in this case, the height of the cliff)
u = initial vertical velocity (0 m/s)
t = time taken
g = acceleration due to gravity (9.8 m/s^2)

Plugging in the values, we get:

45 = (1/2)(9.8)t^2

Simplifying, we find:

t^2 = 45 / ((1/2)(9.8))
t^2 = 45 / 4.9
t^2 ≈ 9.18
t ≈ √9.18
t ≈ 3.03 seconds
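
As a quick sanity check, here is a minimal Python sketch of this step; the variable names (height, g, fall_time) are illustrative and not part of the original problem.

```python
import math

height = 45.0  # cliff height in meters
g = 9.8        # acceleration due to gravity in m/s^2

# From y = ut + (1/2) g t^2 with u = 0, solve for the fall time t.
fall_time = math.sqrt(2 * height / g)
print(f"Fall time: {fall_time:.2f} s")  # prints approximately 3.03 s
```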

Now that we know the time of flight, we can find the vertical component of the rock's velocity just before it hits the water. Since the rock starts with no vertical velocity and accelerates downward under gravity, the vertical velocity at impact is given by:

v_y = u + gt

Where:
v_y = vertical velocity at impact
u = initial vertical velocity (0 m/s)
g = acceleration due to gravity (9.8 m/s^2)
t = time taken (3.03 seconds)

Plugging in the values, we get:

v_y = 0 + (9.8)(3.03)
v_y ≈ 29.7 m/s

The horizontal velocity is unaffected by gravity, so it remains v_x = 18 m/s throughout the flight. (The rock also travels a horizontal distance d = v_x t ≈ 18 × 3.03 ≈ 54.5 meters before hitting the water, but the impact angle is determined by the velocity components at impact, not by the displacements.)
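
For reference, here are the same quantities in a short Python sketch (again with illustrative variable names):

```python
import math

height = 45.0  # cliff height in meters
g = 9.8        # acceleration due to gravity in m/s^2
v_x = 18.0     # horizontal launch speed in m/s

fall_time = math.sqrt(2 * height / g)  # time to fall 45 m, about 3.03 s
v_y = g * fall_time                    # vertical speed at impact, about 29.7 m/s
distance = v_x * fall_time             # horizontal distance traveled, about 54.5 m

print(f"Vertical speed at impact: {v_y:.1f} m/s")
print(f"Horizontal distance: {distance:.1f} m")
```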

With the velocity components at impact (v_y ≈ 29.7 m/s downward and v_x = 18 m/s horizontal), we can calculate the angle below the horizontal using the tangent function:

tan(θ) = v_y / v_x

Where:
θ = angle of impact, measured below the horizontal
v_y = vertical velocity at impact (29.7 m/s)
v_x = horizontal velocity (18 m/s)

Plugging in the values, we get:

tan(θ) = 29.7 / 18
θ = atan(29.7 / 18)
θ ≈ 58.8 degrees

Therefore, the rock strikes the ocean at an angle of approximately 58.8 degrees below the horizontal.
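
Putting all of the steps together in one self-contained Python sketch (variable names are illustrative, and the angle is measured below the horizontal):

```python
import math

height = 45.0  # cliff height in meters
g = 9.8        # acceleration due to gravity in m/s^2
v_x = 18.0     # horizontal launch speed in m/s

fall_time = math.sqrt(2 * height / g)  # time to reach the water
v_y = g * fall_time                    # vertical speed at impact

# Impact angle below the horizontal, from the velocity components.
angle = math.degrees(math.atan2(v_y, v_x))
print(f"Impact angle: {angle:.1f} degrees below the horizontal")  # about 58.8
```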