HELP! I can't figure out the formula for this problem. A pitcher threw a baseball clocked at 90 mph. The pitcher's mound is 60.5 feet from home plate. How long did it take, in seconds, for the ball to travel from the pitcher's mound to home plate?

time = distance/rate

90 miles/hour
= 90(5280) ft / 3600 seconds
= 132 ft/s

so time = 60.5 ft / (132 ft/s)
= 0.45833... seconds

≈ 0.46 seconds
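
If you want to double-check the arithmetic, here is a quick sketch in Python (the variable names are just illustrative, not anything standard):

# Convert 90 mph to feet per second, then compute travel time.
speed_mph = 90
distance_ft = 60.5  # pitcher's mound to home plate

speed_fps = speed_mph * 5280 / 3600   # 1 mile = 5280 ft, 1 hour = 3600 s -> 132.0 ft/s
time_s = distance_ft / speed_fps      # time = distance / rate

print(speed_fps)           # 132.0
print(round(time_s, 2))    # 0.46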

I was close! Thank you!

To find the time it took for the baseball to travel from the pitcher's mound to home plate, we can use the formula:

Time = Distance / Speed

In this case, the distance is given as 60.5 feet and the speed is given as 90 miles per hour. However, we need to convert the speed from miles per hour to feet per second since the distance is given in feet.

To convert miles per hour to feet per second, we need to consider that 1 mile is equal to 5280 feet and 1 hour is equal to 3600 seconds.

So, the conversion factor would be:
(90 miles / 1 hour) * (5280 feet / 1 mile) * (1 hour / 3600 seconds) = 132 feet per second

Now, we can substitute the distance and speed values into the formula:
Time = 60.5 feet / 132 feet per second

Dividing 60.5 by 132 gives approximately 0.458 seconds, which rounds to 0.46 seconds.

Therefore, it took approximately 0.46 seconds for the baseball to travel from the pitcher's mound to home plate.
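
If you want to reuse this calculation for other pitch speeds or distances, here is a small sketch of a helper in Python. The function name travel_time_seconds is just an illustrative choice, assuming distance in feet and speed in miles per hour:

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def travel_time_seconds(distance_ft, speed_mph):
    """Return travel time in seconds for a distance in feet at a speed in mph."""
    speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # convert mph to ft/s
    return distance_ft / speed_fps                            # time = distance / rate

# 90 mph fastball over 60.5 feet -> about 0.46 seconds
print(round(travel_time_seconds(60.5, 90), 2))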