A jogger runs a distance of six miles at an average speed of four miles per hour. How many minutes did this take him?

To find how many minutes it took the jogger to run six miles at an average speed of four miles per hour, one approach is to convert the speed from miles per hour to miles per minute. This is done by dividing the speed in miles per hour by 60, the number of minutes in an hour.

Speed in miles per minute = Speed in miles per hour / 60

Let's calculate the speed in miles per minute:
Speed in miles per minute = 4 miles per hour / 60
Speed in miles per minute ≈ 0.0667 miles per minute

Now, to find the time it took the jogger to run six miles, we can divide the distance (in miles) by the speed (in miles per minute).

Time in minutes = Distance in miles / Speed in miles per minute

Time in minutes = 6 miles / 0.0667 miles per minute
Time in minutes ≈ 90 minutes

Therefore, it took the jogger 90 minutes (1.5 hours) to run a distance of six miles at an average speed of four miles per hour.
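
As a quick check, the same per-minute calculation can be sketched in a few lines of Python (the variable names are only illustrative, not part of the problem):

distance_miles = 6
speed_mph = 4

# Convert mph to miles per minute, then divide the distance by that rate.
speed_miles_per_minute = speed_mph / 60            # 4/60 ≈ 0.0667 mi/min
time_minutes = distance_miles / speed_miles_per_minute
print(time_minutes)                                # 90.0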

Alternatively, use the distance formula d = r * t:

6 = 4 * t
t = 6 / 4 = 1.5 hours

Convert to minutes: 1.5 hours * 60 minutes per hour = 90 minutes.
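
The d = r * t route translates just as directly into Python (again, the names are only illustrative):

distance_miles = 6
rate_mph = 4

time_hours = distance_miles / rate_mph             # t = d / r = 6 / 4 = 1.5 hours
time_minutes = time_hours * 60                     # 1.5 * 60 = 90 minutes
print(time_minutes)                                # 90.0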