A major-league pitcher can throw a baseball in excess of 41 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches a catcher who is 17 m away from the point of release?

Because the horizontal and vertical motions of a projectile are independent (ignoring air resistance), we can first find the time it takes the ball to reach the catcher from its constant horizontal speed, and then calculate how far it falls vertically during that time.

First, we need to find the time it takes for the ball to travel the 17 m distance. We can use the formula:

\[
\text{distance} = \text{speed} \times \text{time}
\]

Rearranging the formula to solve for time:

\[
\text{time} = \frac{\text{distance}}{\text{speed}}
\]

\[
\text{time} = \frac{17 \, \text{m}}{41 \, \text{m/s}} \approx 0.415 \, \text{s}
\]

Next, we calculate how far the ball falls vertically during this time. Since the ball is thrown horizontally, its initial vertical velocity is zero, so it falls from rest under gravity ($g \approx 9.81 \, \text{m/s}^2$):

\[
\text{drop} = \frac{1}{2} \times g \times \text{time}^2
\]

Using the unrounded time of flight:

\[
\text{drop} = \frac{1}{2} \times 9.81 \, \text{m/s}^2 \times (0.4146 \, \text{s})^2 \approx 0.843 \, \text{m}
\]

Therefore, the ball will drop approximately 0.84 meters by the time it reaches the catcher 17 meters from the point of release.
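
As a quick check, here is a minimal Python sketch of the same two-step calculation (the variable names are illustrative):

```python
# Horizontal throw: constant horizontal speed, vertical free fall from rest.
speed = 41.0      # horizontal speed in m/s
distance = 17.0   # distance to the catcher in m
g = 9.81          # acceleration due to gravity in m/s^2

# Step 1: time of flight from the horizontal motion.
time = distance / speed

# Step 2: vertical drop from rest under gravity.
drop = 0.5 * g * time**2

print(f"time of flight: {time:.3f} s")   # ~0.415 s
print(f"vertical drop:  {drop:.3f} m")   # ~0.843 m
```

Computing the drop from the unrounded time, as the script does, avoids the small rounding error you get from squaring an already-rounded 0.415 s.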