"a ball is thrown horizontally from the top of a cliff 256 ft high with an intial speed of 50 ft/sec. find the time of flight of the ball and the distance from thr base of the cliff to the point where the ball lands"

Good grief, feet and pounds? Well, just for now take g = 32 ft/s^2.

The horizontal speed stays at 50 ft/s until the ball hits the ground, since no horizontal force acts on the ball.

256 = (1/2) g t^2 = 16 t^2
t^2 = 16, so t = 4 seconds

d = u t = 50 * 4 = 200 feet
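
If it helps, here is a minimal Python sketch of that same calculation. It assumes g = 32 ft/s^2 as above; the variable names are just illustrative:

    import math

    g = 32.0   # ft/s^2, acceleration due to gravity
    h = 256.0  # ft, height of the cliff
    u = 50.0   # ft/s, horizontal launch speed

    # time to fall h feet from rest: h = (1/2) g t^2  ->  t = sqrt(2h/g)
    t = math.sqrt(2 * h / g)

    # horizontal distance covered at constant speed u
    d = u * t

    print(t)  # 4.0 seconds
    print(d)  # 200.0 feet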

To find the time of flight and the distance from the base of the cliff to the point where the ball lands, we can use the equations of motion.

First, let's determine the time of flight. Since the ball is thrown horizontally, its initial vertical velocity is zero, and gravity is the only acceleration acting on it. The vertical motion can be described using the equation:

h = ut + (1/2)gt^2

Where:
h = height (256 ft)
u = initial vertical velocity (0 ft/s since the ball is thrown horizontally)
g = acceleration due to gravity (32.2 ft/s^2)
t = time of flight

Substituting the given values into the equation, we have:

256 = 0*t + (1/2)*32.2*t^2
256 = 16.1*t^2

Dividing both sides by 16.1, we get:

t^2 = 15.90

Taking the square root of both sides, we find:

t ≈ 3.99 seconds

Therefore, the time of flight of the ball is approximately 3.99 seconds (essentially the 4 seconds found above with g = 32 ft/s^2).
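
If you want to verify this step numerically, here is a quick sketch (assuming g = 32.2 ft/s^2 as in this answer):

    import math

    g = 32.2                  # ft/s^2
    h = 256.0                 # ft
    t = math.sqrt(2 * h / g)  # from h = (1/2) g t^2 with u = 0
    print(round(t, 2))        # -> 3.99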

Next, let's calculate the horizontal distance traveled by the ball. Since the horizontal velocity stays constant throughout the motion (there is no horizontal acceleration), we can use the equation:

d = ut

Where:
d = horizontal distance
u = initial horizontal velocity (50 ft/s)
t = time of flight (≈ 3.99 seconds)

Substituting the given values into the equation, we have:

d = 50 * 3.99
d ≈ 199.4 feet

Therefore, the distance from the base of the cliff to the point where the ball lands is approximately 199 feet (about 200 feet if g = 32 ft/s^2 is used, as in the answer above).
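
Continuing the small sketch above, the horizontal distance follows directly from the constant horizontal speed:

    u = 50.0            # ft/s, horizontal speed
    d = u * t           # t ≈ 3.99 s from the previous step
    print(round(d, 1))  # -> 199.4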