A pitcher threw a baseball clocked at 105 miles per hour. The pitcher's mound is 60 feet 6 inches from home plate. How long did it take, in seconds, for the ball to travel from the pitcher's mound to home plate?


To find the time it took for the ball to travel from the pitcher's mound to home plate, we need to convert the distance from feet and inches to just feet, and then use the speed of the baseball.

First, let's convert 60 feet 6 inches to feet. There are 12 inches in a foot, so we can calculate it as:
60 feet + (6 inches / 12 inches per foot) = 60.5 feet

Now, we need to convert miles per hour to feet per second. There are 5280 feet in a mile, and 3600 seconds in an hour, so we can calculate it as:
105 miles/hour * 5280 feet/mile / 3600 seconds/hour = 154 feet/second
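As a quick check of that conversion, here is a minimal Python sketch (the variable name is just for illustration):

```python
# Convert 105 mph to feet per second: miles -> feet, hours -> seconds.
speed_ft_per_s = 105 * 5280 / 3600
print(speed_ft_per_s)  # 154.0
```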

To find the time it took for the ball to travel, we divide the distance by the speed:
Time = Distance / Speed
Time = 60.5 feet / 154 feet/second

Calculating this, we find that it took approximately 0.393 seconds for the ball to travel from the pitcher's mound to home plate.
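Putting the whole calculation together as a small illustrative sketch (the variable names are assumptions, not part of the problem):

```python
# Distance: 60 ft 6 in expressed in feet.
distance_ft = 60 + 6 / 12            # 60.5 ft

# Speed: 105 mph converted to feet per second.
speed_ft_per_s = 105 * 5280 / 3600   # 154.0 ft/s

# Time = distance / speed.
time_s = distance_ft / speed_ft_per_s
print(round(time_s, 3))              # 0.393
```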