The fastest measured pitched baseball left the pitcher's hand at a speed of 45 m/s. If the pitcher was in contact with the ball over a distance of 1.5 m and produced constant acceleration:

a) What acceleration did he give the ball?

b) How much time did it take him to pitch it?

To find the acceleration given to the ball, we can use the constant-acceleration kinematic equation that relates velocity, acceleration, and distance (and does not involve time):

v² = u² + 2as

where v is the final velocity, u is the initial velocity, a is the acceleration, and s is the distance covered.

Given:
Initial velocity (u) = 0 m/s (since the ball starts from rest in the pitcher's hand)
Final velocity (v) = 45 m/s
Distance covered (s) = 1.5 m

Substituting these values into the formula, we can solve for acceleration (a):

45² = 0² + 2a(1.5)
2025 = 3a
a = 2025 / 3
a = 675 m/s²

Therefore, the acceleration given to the ball is 675 m/s² (the division comes out exactly, so no rounding is needed).
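
As a quick numerical check, here is a minimal Python sketch of the same calculation; the variable names u, v, and s simply mirror the symbols in the formula above:

# Solve v^2 = u^2 + 2as for a: a = (v^2 - u^2) / (2s)
u = 0.0   # initial velocity in m/s (ball starts from rest)
v = 45.0  # final velocity in m/s
s = 1.5   # distance over which the pitcher accelerates the ball, in m

a = (v**2 - u**2) / (2 * s)
print(a)  # 675.0 m/s^2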

To find the time it took for the ball to be pitched, we can use another formula:

v = u + at

Rearranging the formula, we get:

t = (v - u) / a

Substituting the values:

t = (45 - 0) / 675
t = 1/15 s ≈ 0.0667 s

Therefore, it took the pitcher approximately 0.0667 seconds (about 67 milliseconds) to pitch the ball.
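
The same check in Python, reusing the result from part a (the value a = 675.0 below is the acceleration just computed, not a new given):

# Solve v = u + at for t: t = (v - u) / a
u = 0.0    # initial velocity in m/s
v = 45.0   # final velocity in m/s
a = 675.0  # acceleration found in part a, in m/s^2

t = (v - u) / a
print(round(t, 4))  # 0.0667 seconds

As a consistency check, constant acceleration from rest gives an average speed of v/2 = 22.5 m/s, and 1.5 m ÷ 22.5 m/s ≈ 0.0667 s, which agrees.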