Ryan is batting in a baseball game; he can currently cover the 90-foot distance at an average speed of 20 feet per second.

What is the average speed in miles per hour?

To find the average speed in miles per hour, we need to convert the distance from feet to miles and the time from seconds to hours.

First, let's convert the distance from feet to miles. 1 mile is equal to 5280 feet. Therefore, we divide the distance covered (90 feet) by this conversion factor:

90 feet / 5280 feet/mile ≈ 0.01705 miles

Next, let's find the time taken and convert it from seconds to hours. At 20 feet per second, covering 90 feet takes 90 / 20 = 4.5 seconds. There are 60 seconds in a minute and 60 minutes in an hour, so there are 3,600 seconds in an hour, and we divide the time by that factor:

4.5 seconds / 3600 seconds/hour = 0.00125 hours
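
These two conversions are easy to double-check with a few lines of Python (a minimal sketch; the constant and variable names below are just for illustration):

    FEET_PER_MILE = 5280        # 1 mile = 5,280 feet
    SECONDS_PER_HOUR = 60 * 60  # 3,600 seconds in an hour

    distance_miles = 90 / FEET_PER_MILE   # distance covered, in miles
    time_hours = 4.5 / SECONDS_PER_HOUR   # time taken, in hours

    print(round(distance_miles, 5))  # 0.01705
    print(round(time_hours, 5))      # 0.00125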

Finally, we calculate the average speed by dividing the distance in miles by the time in hours:

Average speed = 0.01705 miles / 0.00125 hours

Using a calculator, we get:

Average speed ≈ 13.64 miles per hour

So, Ryan's average speed is approximately 13.64 miles per hour.
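
As a sanity check, the same answer can be reached in one step by converting the speed directly: multiply 20 feet per second by 3,600 seconds per hour and divide by 5,280 feet per mile. A minimal Python sketch of that shortcut (the variable names are just for illustration):

    # Direct conversion: (20 ft/s) * (3600 s/hour) / (5280 ft/mile)
    speed_fps = 20
    speed_mph = speed_fps * 3600 / 5280
    print(round(speed_mph, 2))  # 13.64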