11. Runner A crosses the starting line of a marathon and runs at an average pace of 5.6 miles per hour. Half an hour later, Runner B crosses the starting line and runs at an average rate of 6.4 miles per hour. If the length of the marathon is 26.2 miles, which runner will finish ahead of the other? Explain.

Runner A runs at 5.6 miles per hour, so it will take him 26.2 / 5.6 ≈ 4.68 hours to run the marathon.

Runner B runs at 6.4 miles per hour, so it will take him 26.2 / 6.4 ≈ 4.09 hours to run the marathon.

However, Runner B started half an hour later than Runner A, so measured from Runner B's start, Runner A still needs 4.68 - 0.5 = 4.18 hours to finish.

Since 4.09 < 4.18, Runner B will finish the marathon about 0.09 hours (roughly 5 minutes) before Runner A.

To determine which runner will finish ahead of the other, we need to calculate their respective finishing times.

Let's start by calculating the time it would take for each runner to complete the marathon.

For Runner A:
To find the time taken by Runner A, we divide the distance of 26.2 miles by the average pace of 5.6 miles per hour.
Time taken by Runner A = Distance / Speed = 26.2 miles / 5.6 miles per hour

For Runner B:
Runner B runs at 6.4 miles per hour, so his running time is the distance of 26.2 miles divided by 6.4 miles per hour. Because he starts half an hour after Runner A, we add 0.5 hours to that running time to get his finishing time on the same clock as Runner A.
Finishing time of Runner B = (26.2 miles / 6.4 miles per hour) + 0.5 hours

Now, let's do the calculations:

Time taken by Runner A = 26.2 miles / 5.6 miles per hour = 4.679 hours (approximately)
Running time of Runner B = 26.2 miles / 6.4 miles per hour = 4.094 hours (approximately)
Finishing time of Runner B = 4.094 hours + 0.5 hours = 4.594 hours after the start of the race (approximately)

Comparing the finishing times, we can see that Runner B finishes ahead of Runner A because Runner B's finishing time of approximately 4.594 hours is less than Runner A's finishing time of approximately 4.679 hours, a margin of about 0.085 hours (roughly 5 minutes).

Therefore, Runner B will finish ahead of Runner A.
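
As a quick check, here is a minimal Python sketch (the variable names are my own, not part of the problem) that puts both runners on the same race clock, measured from Runner A's start:

```python
# Inputs taken directly from the problem statement
distance = 26.2    # marathon length in miles
speed_a = 5.6      # Runner A's average speed, miles per hour
speed_b = 6.4      # Runner B's average speed, miles per hour
delay_b = 0.5      # Runner B starts 0.5 hours after Runner A

# Finishing times measured from Runner A's start
finish_a = distance / speed_a            # about 4.679 hours
finish_b = delay_b + distance / speed_b  # about 4.594 hours

print(f"Runner A finishes {finish_a:.3f} hours after the gun")
print(f"Runner B finishes {finish_b:.3f} hours after the gun")
print(f"Runner B finishes {(finish_a - finish_b) * 60:.1f} minutes ahead")
```

Running it confirms a margin of roughly 0.085 hours, or about 5 minutes, in Runner B's favor.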

running time of faster runner = 26.2/6.4 = 4.09375 hours, or about 4:05.6

running time of slower runner = 26.2/5.6 ≈ 4.67857 hours, or about 4:40.7

remember the faster runner started 30 minutes later, so what do you think?
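
If it helps, here is a small helper sketch (the function name hours_to_hm is my own, just for illustration) that converts the decimal hours above into the hours:minutes form used in that answer:

```python
def hours_to_hm(hours: float) -> str:
    """Convert decimal hours into an 'H:MM.m' string."""
    whole_hours = int(hours)
    minutes = (hours - whole_hours) * 60
    return f"{whole_hours}:{minutes:04.1f}"

print(hours_to_hm(26.2 / 6.4))  # faster runner: 4:05.6
print(hours_to_hm(26.2 / 5.6))  # slower runner: 4:40.7
```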