A marathon covers 26 miles and 385 yards. If a runner averages 1.8 meters per second, how long in hrs will it take to complete the marathon?

A marathon is 42.195 km

= 42,195 m

time = 42195/1.8 = 23441.666... seconds
= 23441.666.../3600 hours = 6.5116 hrs, or about 6 hours and 31 minutes
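
For anyone who wants to check the arithmetic, here is a minimal Python sketch of this first method (the 42,195 m figure comes from the conversion above):

```python
# Method 1: distance in meters divided by speed in m/s
distance_m = 42_195   # marathon distance, 42.195 km expressed in meters
speed_ms = 1.8        # runner's average speed in m/s

time_s = distance_m / speed_ms           # 23441.666... seconds
time_h = time_s / 3600                   # 6.5116 hours
hours = int(time_h)
minutes = round((time_h - hours) * 60)   # 0.5116 h * 60 is about 31 minutes
print(f"{time_s:.2f} s = {time_h:.4f} h, about {hours} h {minutes} min")
```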

or
1.8 m/s
= 6480 m/h
= 6.48 km/h

time = 42.195/6.48 = 6.5116 hrs, as above
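
The same check in Python with the speed converted to km/h first; the only factor assumed is 3600 s per hour:

```python
# Method 2: convert the speed to km/h, then divide kilometers by km/h
speed_kmh = 1.8 * 3600 / 1000   # 1.8 m/s = 6480 m/h = 6.48 km/h
time_h = 42.195 / speed_kmh     # 6.5116 hours, as above
print(f"{speed_kmh:.2f} km/h -> {time_h:.4f} h")
```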

or

1.8 m/s
= 180 cm/s
= 70.866... inches/s
= 5.90551... ft/s (stored in the calculator's memory)

26 miles and 385 yards
= 26(5280) + 3(385) ft
= 138435 ft

time = 138435/5.90551... = 23441.66 seconds
= 23441.66/3600 hours
= 6.5116 hrs, same as above
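
And a sketch of the imperial route; the only extra conversion factors assumed are 2.54 cm per inch, 12 inches per foot, 5280 feet per mile, and 3 feet per yard:

```python
# Method 3: work entirely in feet and feet per second
speed_fps = 180 / 2.54 / 12          # 1.8 m/s = 180 cm/s = 5.90551... ft/s
distance_ft = 26 * 5280 + 385 * 3    # 26 miles 385 yards = 138,435 ft
time_s = distance_ft / speed_fps     # 23441.66... seconds
print(f"{time_s:.2f} s = {time_s / 3600:.4f} h")
```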

(Editorial: of course, most serious runners think only in metric, so this kind of conversion problem does not exist for them.)

To find the time it takes to complete the marathon, the distance must be expressed in units that match the runner's speed. Since the speed is given in meters per second, we convert the distance from miles and yards to meters.

First, let's convert 26 miles to meters. 1 mile is approximately equal to 1609.34 meters. So, 26 miles will be:

26 miles * 1609.34 meters/mile ≈ 41842.84 meters

Next, let's convert 385 yards to meters. 1 yard is equal to 0.9144 meters. So, 385 yards will be:

385 yards * 0.9144 meters/yard ≈ 352.04 meters

Now we can add up these two distances to get the total distance in meters:

41842.84 meters + 352.04 meters = 42194.88 meters

The runner's average speed is 1.8 meters per second. To find the time it takes to complete the marathon, we divide the total distance by the runner's speed:

42194.88 meters ÷ 1.8 meters/second = 23441.6 seconds

To convert seconds to hours, we divide by 3600 (since there are 3600 seconds in an hour):

23441.6 seconds ÷ 3600 seconds/hour ≈ 6.5116 hours

Therefore, it will take approximately 6.5116 hours to complete the marathon, which agrees with the result above.
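
A compact check of this miles-and-yards route (a sketch; with the exact factor 1609.344 m per mile the total comes to 42,194.988 m, which is why the marathon distance is quoted as 42.195 km):

```python
# Method 4: convert 26 mi 385 yd to meters, then divide by the speed
METERS_PER_MILE = 1609.344   # exact: 5280 ft * 0.3048 m/ft
METERS_PER_YARD = 0.9144     # exact by definition

distance_m = 26 * METERS_PER_MILE + 385 * METERS_PER_YARD  # 42,194.988 m
time_h = distance_m / 1.8 / 3600                           # 6.5116 hours
print(f"{distance_m:.3f} m -> {time_h:.4f} h")
```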