posted by CONFUSED!!!! on .
When baseball outfielders throw the ball, they usually allow it to take one bounce on the theory that the ball arrives sooner this way. Suppose that after the bounce the ball rebounds at the same angle theta as it had when released, but loses half its speed.
Assuming the ball is always thrown with the same initial speed, at what angle should the ball be thrown in order to go the same distance D with one bounce (lower path) as one thrown upward at an angle of 47.8° with no bounce (upper path)?
The only hint my teacher gave me was: solve this problem symbolically before you plug in numbers.
I don't even know where to start.
Throw a ball straight up and it stays in the air a long time, but it doesn't go anywhere: the energy of the throw all went into vertical speed, leaving nothing for horizontal motion.
So when you angle a throw upwards, the ball is in the air longer, but its horizontal velocity is lower.
To cover a distance D quickly, horizontal velocity needs to be as large as possible. So, neglecting friction, throwing the ball horizontally and letting it bounce/roll is the way to go. But the bounce is not free: with each bounce the ball loses energy.
So the question is: losing 3/4 of the kinetic energy on one bounce (the speed is halved, so the energy drops to 1/4), is it still a better deal to let it bounce and get the ball there faster?
Look at the time to the plate, which is determined by the horizontal velocity.
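If it helps to see the teacher's "solve it symbolically first" hint carried out, here is a short Python sketch (my own working, not from the textbook). It assumes the standard level-ground range formula R = v² sin(2θ)/g and the problem's statement that the bounce keeps the angle but halves the speed. Note that the factor v²/g cancels when you set the two distances equal, which is exactly why you never need a number for the initial speed:

```python
import math

# Upper path (no bounce, angle theta0 = 47.8 deg):
#   D = (v**2/g) * sin(2*theta0)
# Lower path (one bounce at angle theta; speed halves after the bounce):
#   first hop:  (v**2/g)      * sin(2*theta)
#   second hop: ((v/2)**2/g)  * sin(2*theta) = (1/4)*(v**2/g)*sin(2*theta)
#   total:      (5/4)*(v**2/g)*sin(2*theta)
# Setting the two distances equal, v**2/g cancels:
#   sin(2*theta) = (4/5) * sin(2*theta0)

theta0 = math.radians(47.8)
theta = 0.5 * math.asin(0.8 * math.sin(2 * theta0))
print(round(math.degrees(theta), 1))  # about 26.4 degrees
```

So the bounced throw should leave the hand at a much flatter angle, roughly 26.4°, to land at the same distance D, which matches the intuition above that a flatter throw keeps more of the speed horizontal.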
I don't know.