A police car traveling 100 ft/sec is chasing a car being driven erratically at 75 ft/sec. If the police car was 300 feet behind the other vehicle when the chase began, about how many seconds will it take to catch the erratic driver?

I think I know how to solve this, but I want to confirm it. Someone help? Thank you.

The distance between the cars decreases at a rate of 100 − 75 = 25 ft/s. Divide the 300 ft gap by that closing rate: 300 ÷ 25 = 12 seconds.

The stopping distance d of a car after the brakes are applied varies directly as the square of the speed r. If a car traveling 40 mph can stop in 90 ft, how many feet will it take the same car to stop when it is traveling 30 mph?
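For this second question, "varies directly as the square of the speed" means d = k·r². Here is a quick sketch of that calculation in Python (my own check of the arithmetic, not part of the original post):

```python
# Direct variation: d = k * r**2.
# Use the known data point (r = 40 mph, d = 90 ft) to find k.
k = 90 / 40**2        # k = 0.05625 ft per mph^2

# Stopping distance at 30 mph.
d_30 = k * 30**2
print(d_30)  # 50.625 ft
```

So the same car traveling 30 mph stops in 50.625 ft.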

To solve this problem, we can set up an equation based on the relative speeds of the two vehicles.

Let's assume that it will take t seconds for the police car to catch the erratic driver. In that time, the police car will have traveled a distance of 100t feet, while the erratic driver will have traveled a distance of 75t feet.

Since the police car is initially 300 feet behind the erratic driver, we can set up the following equation:

Distance covered by police car = Distance covered by erratic driver + Initial distance between them

100t = 75t + 300

To solve this equation, we can subtract 75t from both sides:

100t - 75t = 300

25t = 300

Finally, we can divide both sides by 25 to solve for t:

t = 300 / 25

t = 12

Therefore, it will take 12 seconds for the police car to catch the erratic driver.
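As a quick sanity check, the algebra above can be mirrored in a few lines of Python (my own verification, not part of the original solution):

```python
# Setup from the problem: police at 100 ft/s, erratic driver at 75 ft/s,
# with the driver holding a 300 ft head start.
police_speed = 100   # ft/s
driver_speed = 75    # ft/s
head_start = 300     # ft

# Closing speed is the difference of the two speeds: 25 ft/s.
closing_speed = police_speed - driver_speed

# Time to close the gap: t = gap / closing speed.
t = head_start / closing_speed
print(t)  # 12.0

# Check: at t seconds both cars are at the same position.
print(police_speed * t == driver_speed * t + head_start)  # True
```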