You want to figure out how fast you can throw a ball, but don't have access to a radar gun. You measure the height from the ground to your hand as you throw and discover that when you throw a ball it leaves your hand at a height of 2 m from the ground. You then throw a ball horizontally and discover that it hits the ground 20 m from where you're standing. How fast in m/s did you throw the ball?

h = gt²/2  =>  t = √(2h/g)

s = vt  =>  v = s/t = s / √(2h/g) = 20 / √(2 · 2 / 9.8) ≈ 31.3 m/s

To determine the speed at which you threw the ball, you can use the principles of projectile motion. The key is to find the time the ball is in the air, which is set entirely by its vertical fall. Here's how you can do it:

Step 1: Calculate the time of flight
Since the ball is thrown horizontally, it experiences only the effect of gravity in the vertical direction. The vertical motion can be calculated using the equation:
h = (1/2)gt^2
where h is the height (2 m) and g is the acceleration due to gravity (approximately 9.8 m/s^2).

Solving this equation for time (t):
2 = (1/2)(9.8)t^2
t^2 = (2 × 2)/9.8
t^2 ≈ 0.408
t ≈ √0.408
t ≈ 0.639 seconds (rounded to three decimal places)

Step 2: Calculate the horizontal distance
The horizontal distance traveled by the ball is given as 20 m.

Step 3: Calculate the velocity
To find the launch speed (v), divide the horizontal distance (d) by the time of flight (t):
v = d/t
v = 20 m / 0.639 s
v ≈ 31.3 m/s (rounded to one decimal place)

Therefore, you threw the ball with an approximate speed of 31.3 m/s.
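The two steps above can be checked with a short calculation. This is a minimal sketch using the values from the problem (h = 2 m, d = 20 m, g = 9.8 m/s²); the variable names are my own choices.

```python
import math

# Values taken from the problem statement
h = 2.0   # release height in m
d = 20.0  # horizontal distance to impact in m
g = 9.8   # acceleration due to gravity in m/s^2

# Time of flight from h = (1/2) g t^2
t = math.sqrt(2 * h / g)

# Launch speed from d = v t
v = d / t

print(f"t ≈ {t:.3f} s")    # ≈ 0.639 s
print(f"v ≈ {v:.1f} m/s")  # ≈ 31.3 m/s
```

Note that the result depends only on the height of release and the horizontal range, which is why no radar gun is needed.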