If an average catcher takes 1.9 seconds to throw a baseball from home plate to second base on a 90-foot baseball diamond, how fast, in mph, did the catcher throw the ball?

5280 feet = 1 mile

90/5280 = the distance as a fraction of a mile

60 seconds in 1 minute and 60 minutes in 1 hour

Divide 1.9 by 3600 (60 × 60 seconds) to get the time as a fraction of an hour.

To get mph, divide the first result by the second (see the sketch below).
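If it helps to see those three steps in one place, here is a minimal Python sketch of the outline above (the 90 ft distance and 1.9 s time come straight from the problem as stated; the variable names are just illustrative):

```python
# Sketch of the outlined method: convert feet -> miles and seconds -> hours,
# then divide. The 90 ft and 1.9 s values come from the problem statement.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 60 * 60  # 3600

distance_miles = 90 / FEET_PER_MILE   # ~0.01705 mi
time_hours = 1.9 / SECONDS_PER_HOUR   # ~0.000528 h

speed_mph = distance_miles / time_hours
print(f"{speed_mph:.1f} mph")         # prints 32.3
```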

To calculate the speed of the baseball thrown by the catcher from home plate to second base, we first convert the distance to miles and the time to hours. Then we can use the formula:

Speed (in mph) = Distance (in miles) / Time (in hours)

Let's start by converting the distance from feet to miles:

Distance = 90 feet / 5280 (feet in a mile)
Distance ≈ 0.01705 miles

Now, let's focus on the time. The catcher takes 1.9 seconds to throw the ball.

Time = 1.9 seconds / 3600 (seconds in an hour)
Time ≈ 0.000528 hours

Now, we have both the distance and time in the required units. Let's calculate the speed:

Speed (in mph) = 0.01705 miles / 0.000528 hours
Speed ≈ 32.3 mph

Therefore, the average speed of the baseball thrown by the catcher from home plate to second base is approximately 32.3 mph.
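As a quick cross-check using the same numbers but a different route, you can compute feet per second first and then convert, since 1 ft/s = 3600/5280 mph:

```python
# Cross-check: compute ft/s first, then scale by 3600/5280 to get mph.
feet_per_second = 90 / 1.9                 # ~47.4 ft/s
speed_mph = feet_per_second * 3600 / 5280  # ~32.3 mph
print(f"{speed_mph:.1f} mph")
```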