In baseball, a pitcher can accelerate a 0.15 kg ball from rest to 99 mi/h in a distance of 1.4 m.

What is the average force exerted on the ball during the pitch?

To find the average force exerted on the ball, we need to use Newton's second law of motion, which states that the force exerted on an object is equal to the mass of the object multiplied by its acceleration. We first need to convert the given velocity from miles per hour to meters per second, as the other measurements are given in meters.

1 mile = 1609.34 meters (approximate conversion)
1 hour = 3600 seconds

Therefore, 99 mi/h = (99 * 1609.34) m / (1 * 3600) s.

Now we can calculate the velocity in meters per second:

V = (99 * 1609.34) m / (1 * 3600) s = 44.257 m/s (rounded to three decimal places)
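As a quick check, the unit conversion can be done in a couple of lines of Python (here using 1609.344 m, the exact definition of a statute mile; the approximate 1609.34 gives the same result to three decimal places):

```python
# Convert 99 mi/h to m/s
METERS_PER_MILE = 1609.344   # exact definition of a statute mile
SECONDS_PER_HOUR = 3600

v = 99 * METERS_PER_MILE / SECONDS_PER_HOUR
print(round(v, 3))  # 44.257
```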

Next, we need to calculate the acceleration. We can use the kinematic equation:

V^2 = U^2 + 2as

Where:
V = final velocity (44.257 m/s)
U = initial velocity (0 m/s, since the ball starts from rest)
a = acceleration (unknown)
s = distance (1.4 m)

Rearranging the equation, we have:

a = (V^2 - U^2) / (2s)

Now we can substitute the values:

a = (44.257^2 - 0^2) / (2 * 1.4) = 699.5 m/s^2 (rounded to one decimal place)

Finally, we can calculate the force using Newton's second law:

F = ma

Substituting the values:

F = 0.15 kg * 699.5 m/s^2 = 104.9 N (rounded to one decimal place)

Therefore, the average force exerted on the ball during the pitch is approximately 105 Newtons.
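Putting the whole calculation together, a short Python sketch of the three steps (unit conversion, kinematics, Newton's second law) looks like this:

```python
# End-to-end computation of the average force on the pitched ball
m = 0.15                     # ball mass in kg
s = 1.4                      # distance over which the ball accelerates, in m
v = 99 * 1609.344 / 3600     # final speed: 99 mi/h converted to m/s

a = v**2 / (2 * s)           # from v^2 = u^2 + 2as with u = 0 (starts from rest)
F = m * a                    # Newton's second law

print(f"v = {v:.3f} m/s")    # v = 44.257 m/s
print(f"a = {a:.1f} m/s^2")  # a = 699.5 m/s^2
print(f"F = {F:.1f} N")      # F = 104.9 N
```

Keeping the full-precision intermediate values (rather than rounded ones) avoids accumulating rounding error in the final force.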