A catcher takes 1.9 seconds to throw a ball from home plate to second base. The baseball diamond is 90 feet. How fast, in miles per hour, did the catcher throw the ball?

Recall:

rate = distance / time
rate = 90/1.9 ft/s
     = (90/5280) / (1.9/3600) mph
     ≈ 32.3 mph

32.3 mph
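
As a quick check, the same one-line conversion in Python (a minimal sketch; 5280 ft per mile and 3600 s per hour are the only constants needed):

```python
# rate = distance / time, with feet -> miles and seconds -> hours conversions
distance_ft = 90   # home plate to second base, as given in the problem
time_s = 1.9       # duration of the throw

FT_PER_MILE = 5280
S_PER_HOUR = 3600

speed_mph = (distance_ft / FT_PER_MILE) / (time_s / S_PER_HOUR)
print(round(speed_mph, 1))  # prints 32.3
```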

To calculate the speed at which the catcher threw the ball in miles per hour, we need to convert the distance from feet to miles and the time from seconds to hours.

First, let's convert the distance. There are 5280 feet in a mile, so to convert 90 feet to miles, we divide 90 by 5280:

90 feet / 5280 feet/mile ≈ 0.01705 miles

Next, let's convert the time. There are 60 seconds in a minute and 60 minutes in an hour, so to convert 1.9 seconds to hours, we divide 1.9 by 3600:

1.9 seconds / 3600 seconds/hour ≈ 0.000528 hours
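
Both conversions are simple divisions; as an illustrative check in Python (the variable names are just for clarity):

```python
distance_miles = 90 / 5280   # ≈ 0.01705 miles
time_hours = 1.9 / 3600      # ≈ 0.000528 hours
print(distance_miles, time_hours)
```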

Finally, to find the speed, we divide the distance (in miles) by the time (in hours):

Speed = Distance / Time = 0.01705 miles / 0.000528 hours

Using a calculator, the speed is approximately 32.3 miles per hour.

Therefore, the catcher threw the ball at a speed of approximately 32.3 miles per hour.
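
The whole calculation can also be packaged as a small helper, shown here as a sketch in Python (the function name throw_speed_mph is just illustrative):

```python
def throw_speed_mph(distance_ft: float, time_s: float) -> float:
    """Convert a throw measured in feet and seconds to miles per hour."""
    distance_miles = distance_ft / 5280   # feet -> miles
    time_hours = time_s / 3600            # seconds -> hours
    return distance_miles / time_hours

print(round(throw_speed_mph(90, 1.9), 1))  # 32.3
```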