A pitcher threw a baseball clocked at 105 miles per hour. The pitcher’s mound is 60 feet 6 inches from home plate. How long did it take, in seconds, for the ball to travel from the pitcher’s mound to home plate?

105 mi/hr × 5280 ft/mi ÷ 3600 s/hr = 154 ft/s

So, it takes

60.5/154 seconds

which is about 0.393 seconds.
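
If you want to sanity-check the arithmetic, here is a quick Python snippet using the values from the question (60.5 ft and 105 mph):

```python
# Convert 105 mi/h to ft/s: 5280 ft per mile, 3600 s per hour.
speed_ft_per_s = 105 * 5280 / 3600   # 154.0 ft/s, exactly
travel_time_s = 60.5 / speed_ft_per_s
print(speed_ft_per_s)  # 154.0
print(travel_time_s)   # 0.39285714...
```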

To calculate the time it took for the baseball to travel from the pitcher's mound to home plate, we can use the formula time = distance / speed.

First, let's convert the distance from feet and inches to just feet. Since there are 12 inches in a foot, we have:

60 feet + (6 inches ÷ 12 inches per foot) = 60.5 feet

Now, we can plug in the values into the formula:

time = 60.5 feet / (105 miles per hour)

To calculate the time in seconds, we need to convert miles per hour to feet per second. Since there are 5280 feet in a mile and 3600 seconds in an hour, we have:

105 miles per hour × 5280 feet per mile ÷ 3600 seconds per hour = 154 feet per second (exactly)

Plugging this back into the formula, we get:

time = 60.5 feet / 154 feet per second

time ≈ 0.393 seconds

So, it took approximately 0.393 seconds for the baseball to travel from the pitcher's mound to home plate.
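
If you want to reuse the same steps, here is a minimal sketch that wraps them in a function; the function name and parameters are just illustrative, not from the original problem:

```python
MILE_IN_FEET = 5280
HOUR_IN_SECONDS = 3600

def travel_time_seconds(distance_ft: float, speed_mph: float) -> float:
    """Time in seconds to cover distance_ft at a constant speed_mph."""
    speed_ft_per_s = speed_mph * MILE_IN_FEET / HOUR_IN_SECONDS
    return distance_ft / speed_ft_per_s

# Pitcher's mound to home plate: 60 ft 6 in at 105 mph.
print(travel_time_seconds(60 + 6 / 12, 105))  # ≈ 0.393
```

Note that this treats the pitch as moving at a constant 105 mph over the whole distance, which is the same simplifying assumption both answers above make.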