A baseball is thrown horizontally from the top of a cliff 50 meters up. The initial velocity of the ball is 10 m/s. How far from the base of the cliff will the ball land?

How long does the ball take to fall 50 meters?

d = 4.9 t^2
t = sqrt(50 / 4.9) ≈ 3.19 s

How far does it travel horizontally in that time?

d = u × t = 10 × 3.19 ≈ 31.9 meters

To find how far the baseball lands from the base of the cliff, we treat the horizontal and vertical motions independently. The vertical motion determines how long the ball is in the air; the horizontal motion proceeds at constant velocity, since no horizontal force acts on the ball after it is thrown.

We can use the formula for horizontal motion: distance = velocity × time. To apply it, we first need the time it takes for the ball to reach the ground. Since the ball is thrown horizontally, its initial vertical velocity is 0 m/s, and the only vertical force is gravity, which accelerates the ball downward.

To find the time it takes for the ball to fall from the top of the cliff to the ground, we can use the formula: distance = (1/2) × acceleration × time^2. The distance is the vertical distance of the cliff, which is 50 meters. The acceleration due to gravity is 9.8 m/s^2, and we want to solve for time.

Substituting the values into the formula: 50 m = (1/2) × 9.8 m/s^2 × time^2.

Rearranging the equation to solve for time:
time^2 = (2 × 50 m) / 9.8 m/s^2 ≈ 10.2 s^2

Taking the square root of both sides to solve for time:
time ≈ √10.2 ≈ 3.2 s
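
As a quick sanity check, here is a minimal Python sketch of the fall-time step (the variable names g, h, and t are my own choices, not from the problem):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 50.0  # cliff height, m

# From h = (1/2) * g * t^2, solve for the fall time t.
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.2f} s")  # prints: fall time: 3.19 s
```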

Now that we know how long the ball is in the air, we can find the horizontal distance it travels. Since the initial horizontal velocity is 10 m/s and there is no horizontal acceleration, distance = velocity × time.

Substituting the values into the formula:
distance = 10 m/s × 3.2 s ≈ 32 meters

Therefore, the baseball will land approximately 32 meters from the base of the cliff.
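
Putting both steps together in one self-contained sketch (again with my own variable names), carrying full precision through:

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 50.0  # cliff height, m
v = 10.0  # initial horizontal velocity, m/s

t = math.sqrt(2 * h / g)  # fall time from the vertical motion
x = v * t                 # no horizontal acceleration: distance = velocity * time
print(f"landing distance: {x:.1f} m")  # prints: landing distance: 31.9 m
```

Keeping full precision gives 31.9 meters, matching the quick working above; rounding the fall time to 3.2 s before multiplying gives the 32 meters quoted here. Either figure is acceptable to two significant figures.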