What is the acceleration of a baseball thrown by a pitcher in a major league baseball game, if the ball goes from 0 to 90 mph in a one-half-second time period?

V = 90 mi/h * 1600 m/mi * 1 h/3600 s ≈ 40 m/s (using the rounded factor 1600 m/mi; the exact value is about 1609 m/mi)

a = (V-Vo)/t = (40-0)/0.5 = 80 m/s^2.
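As a quick sanity check, here is a minimal Python sketch of the same metric calculation (the variable names are illustrative, not part of the original solution):

```python
# Metric version: convert 90 mph to m/s, then apply a = (V - V0) / t.
v_mph = 90.0
m_per_mile = 1600.0   # rounded conversion used above; the exact value is ~1609.34
s_per_hour = 3600.0

v_ms = v_mph * m_per_mile / s_per_hour  # 40.0 m/s
a = (v_ms - 0.0) / 0.5                  # 80.0 m/s^2
print(v_ms, a)
```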

To determine the acceleration of the baseball, we can use the equation:

Acceleration (a) = Change in velocity (Δv) / Time taken (Δt)

Given that the ball starts from rest (0 mph) and reaches a velocity of 90 mph in a one-half-second (0.5 seconds) time period, we can calculate the change in velocity.

Change in velocity (Δv) = Final velocity (v_f) - Initial velocity (v_i)

Here, the initial velocity (v_i) is 0 mph and the final velocity (v_f) is 90 mph.

Δv = 90 mph - 0 mph = 90 mph

Next, we substitute the values into the formula to find the acceleration:

Acceleration (a) = Δv / Δt

Acceleration (a) = 90 mph / 0.5 seconds

Now, we need to convert the speed from miles per hour to feet per second, since we want the acceleration in feet per second squared.

To convert from mph to fps, note that 1 mile = 5280 feet and 1 hour = 3600 seconds, so:

1 mph = 5280/3600 fps ≈ 1.47 fps
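The factor can be checked with a small Python helper (a sketch; the function name is just for illustration):

```python
def mph_to_fps(speed_mph: float) -> float:
    """Convert miles per hour to feet per second (1 mi = 5280 ft, 1 h = 3600 s)."""
    return speed_mph * 5280.0 / 3600.0

print(mph_to_fps(1.0))   # 1.4666..., i.e. about 1.47 fps
print(mph_to_fps(90.0))  # 132.0 fps
```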

So, Change in velocity (Δv) = 90 mph * 1.47 fps per mph ≈ 132.3 fps

Finally, we can calculate the acceleration:

Acceleration (a) = 132.3 fps / 0.5 seconds

Acceleration (a) = 264.6 ft/s^2 (approximately)

Therefore, the acceleration of a baseball thrown by a pitcher in a major league baseball game, if the ball goes from 0 to 90 mph in a one-half-second time period, is approximately 264.6 feet per second squared (about 80 m/s^2), consistent with the metric calculation above.
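Putting the whole imperial-unit calculation together in Python (a minimal sketch using the rounded 1.47 factor from above):

```python
# Imperial version: convert 90 mph to fps, then apply a = dv / dt.
dv_fps = 90.0 * 1.47   # ~132.3 fps (the exact factor 5280/3600 gives 132.0)
dt = 0.5               # seconds
a = dv_fps / dt        # ~264.6 ft/s^2
print(a)               # consistent with ~80 m/s^2 from the metric calculation
```

Multiplying 264.6 ft/s^2 by 0.3048 m/ft gives about 80.6 m/s^2; the small discrepancy from the metric result comes from the rounded conversion factors.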