A ball is thrown horizontally at 10.0 m/s from the top of a hill 50.0 m high. How far from the base of the hill would the ball hit the ground?

I don't understand the following explanation by Quittich (sp):

"Since there is no vertical component, the time to hit the ground is the same as if th ball had been dropped from a height of 50.0m. Calculate how long it takes for the ball to fall 50.0m. Take that time and multiply by the horizontal velocity (given as 10.0 m/s)".

OK, here are the steps to go through to solve the time equation:

Just to be sure...
The expression t^2 means t to the second power or t * t.
Solving...
50.0 m = (0.5) * (9.8 m/s^2) * (t^2)
50.0 m = (4.9 m/s^2) * (t^2)
Dividing both sides by (4.9 m/s^2):
(50.0/4.9) s^2 = t^2
10.2 s^2 = t^2
Taking the square root of both sides:
3.19 s = t

So the horizontal distance is:
(3.19 s) * (10.0 m/s) = 31.9 m
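If you want to double-check that arithmetic, here is a quick Python sketch of the same two steps (just a sanity check, assuming g = 9.8 m/s^2):

import math
t = math.sqrt(50.0 / 4.9)   # from 50.0 = 4.9 * t^2, so t^2 = 10.2 and t is about 3.19 s
d = 10.0 * t                # horizontal distance = horizontal velocity * time, about 31.9 m
print(t, d)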

OK, that's what I did; it just didn't look right.

The time that the ball is in the air just depends on how far it has to fall down to the ground. Since the ball was thrown HORIZONTALLY the initial VERTICAL velocity is 0. So, the ball immediately starts to drop. This happens at the same rate as if you just dropped it from the same height.

If you quickly rolled a ball off a table and, just as that ball rolled over the edge, dropped another ball, they would both hit the ground at the same time, because both start with zero vertical velocity and fall with the same acceleration.

Back to the problem...
The time to hit the ground is then used to find out how far the ball traveled horizontally. The horizontal velocity is 10.0 m/s (given) and does not change. (Disregard air friction for this). So, the distance from the hill is just (time in the air) * (10.0 m/s).

Here is the formula to calculate the time in the air:

distance = (initial distance) + (initial velocity) * time + (0.5) * (acceleration) * (t^2)

Here, initial distance and initial velocity are both 0. So the equation becomes:
50.0 m = (0.5) * (gravity) * (t^2)
Solve for t.

Then use the value you found for t.
horizontal distance = t * 10.0 m/s
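In case code reads more clearly than prose, here is the general distance formula from above as a small Python sketch (the function name position is just made up for illustration):

def position(x0, v0, a, t):
    # distance = (initial distance) + (initial velocity)*time + (0.5)*(acceleration)*(t^2)
    return x0 + v0 * t + 0.5 * a * t ** 2

# Vertical drop with x0 = 0 and v0 = 0 (thrown horizontally), a = 9.8 m/s^2:
# position(0.0, 0.0, 9.8, 3.19) gives about 49.9 m, so t of about 3.19 s is consistent with a 50.0 m fall.
print(position(0.0, 0.0, 9.8, 3.19))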

t=1.02

initial velocity=10.0

I'm confused about what to do, but from what you said, I used the equation you gave me, plugged in the numbers I thought were right, and got an answer of 31.9.

That was before I knew you responded.

OK, but when you get your answers, post them and we'll be sure they look good.

I think the horizontal velocity is 10.0 m/s, not 50.0 m/s.

The explanation by "Quittich" is referring to the concept of projectile motion. When an object is thrown horizontally, it has only a horizontal velocity component and no vertical velocity component. This means the object keeps moving horizontally at a constant speed while it falls vertically under the force of gravity.

In this case, the ball is thrown horizontally at 10.0 m/s from the top of a hill 50.0 m high. The question is asking how far from the base of the hill the ball would hit the ground.

To solve this, we can first determine the time it takes for the ball to fall vertically from a height of 50.0 m. In projectile motion, the time it takes for an object to fall vertically is the same as if the object had been dropped from that height.

To calculate this time, we can use the equation of motion for free fall:

h = (1/2)gt^2

Where h is the vertical displacement (50.0 m in this case), g is the acceleration due to gravity (approximately 9.8 m/s^2), and t is the time.

Solving for t, we get:

t = sqrt(2h/g)
= sqrt(2 * 50.0 / 9.8)
= sqrt(10.2041)
≈ 3.19 s

So, it takes approximately 3.19 seconds for the ball to fall vertically from the top of the hill.

Now, since the ball is thrown horizontally, we can use the horizontal velocity (10.0 m/s) and the time (3.19 s) to determine how far it travels horizontally.

The distance traveled horizontally is given by the equation:

d = v * t

Where d is the distance, v is the horizontal velocity, and t is the time.

Plugging in the values, we get:

d = 10.0 * 3.19
≈ 31.9 m

Therefore, the ball would hit the ground approximately 31.9 meters from the base of the hill.
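Putting both steps together, here is a minimal Python sketch of the whole method (the name horizontal_range is just made up here, and g = 9.8 m/s^2 is assumed):

import math

def horizontal_range(height, v_horizontal, g=9.8):
    # time to fall: from h = (1/2)*g*t^2, so t = sqrt(2h/g)
    t = math.sqrt(2.0 * height / g)
    # horizontal velocity is constant (no air resistance), so d = v * t
    return v_horizontal * t

print(horizontal_range(50.0, 10.0))   # about 31.9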

I'm just not getting anything. I'm doing it two ways:

50.0m = (0.5)*(9.8)*(t^2)
4.9 = t
4.9*10=49

or 50.0m = (0.5)*(6.67 x 10^-11)*(t^2)
3.335 x 10^-11 = t
3.335 x 10^-11 times 10.0 = I can't even get that

Since there is no initial vertical velocity, gravity pulls the ball down 50 m in t seconds, which derives from h = V0*t + (g*t^2)/2.

Since V0 = 0, 50 = 9.8*t^2/2, making t = 3.19 sec.

In that 3.19 seconds, the ball travels horizontally a distance of d = 10.0(3.19) ≈ 31.9 m, ignoring air resistance.