The time t that it takes for a salesman to drive a certain distance d varies inversely as the average speed r. It takes the salesman 4.75 h to travel between two cities at 60 mi/h. How long would the drive take?

I think it is 5.7.

To solve this problem, we need to use the concept of inverse variation: two quantities vary inversely when their product is constant, so as one increases, the other decreases proportionally.

In this case, the time it takes (t) to travel a certain distance (d) varies inversely with the average speed (r). Mathematically, this can be represented as:

t ∝ 1/r

To find the equation that relates these variables, we introduce a constant of variation, k:

t = k/r

To find the value of k, we can use the given information. We know that when the salesman travels between the two cities at an average speed of 60 miles per hour, it takes 4.75 hours. Substituting these values into the equation:

4.75 = k/60

To solve for k, we multiply both sides of the equation by 60:

4.75 * 60 = k

k = 285
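
As a quick sanity check, here is a minimal Python sketch of that step (the variable names t_hours, speed_mph, and k are just illustrative, not from the problem):

```python
# Given trip: 4.75 h at an average speed of 60 mi/h.
t_hours = 4.75
speed_mph = 60

# For inverse variation t = k/r, the constant is k = t * r.
k = t_hours * speed_mph
print(k)  # 285.0 -- this is also the distance d in miles, since d = r * t
```

Notice that the constant of variation here is just the distance between the two cities, 285 miles.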

Now that we have the value of k, we can use it to find the time it would take to travel the given distance at a different average speed. Let's assume the new average speed is x miles per hour. The equation becomes:

t = 285/x

To find a numerical time, we need the new average speed, and the problem as stated doesn't give one. If we substitute the original 60 miles per hour back in, we simply recover the original time:

t = 285/60

t = 4.75 hours

So, as written, the question can only return the given 4.75 hours; the second speed appears to have been dropped from the problem statement. For what it's worth, a speed of 50 mi/h would give t = 285/50 = 5.7 hours, which matches your answer.
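
If the missing second speed turns up, the time follows directly from the same constant. A small sketch of that, assuming the function name drive_time_hours and the sample speeds (the problem itself doesn't state the new speed):

```python
def drive_time_hours(speed_mph: float, k: float = 285.0) -> float:
    """Time in hours to cover the same route at the given average speed (t = k/r)."""
    return k / speed_mph

# Plugging the original 60 mi/h back in just recovers the given time:
print(drive_time_hours(60))   # 4.75
# A slower, hypothetical speed of 50 mi/h gives the 5.7 h figure:
print(drive_time_hours(50))   # 5.7
```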

It takes 4.75 hours.

Looks like an error in the wording of the problem.

How do you get 4.75?