A bus driver wants to average at least 40 miles per hour on a 70-mile trip. If the first 35 miles takes the driver 1 hour, what should be the lowest average speed for the second 35 miles?


To find the lowest average speed for the second 35 miles, we need to consider the total time available and the distance to be covered.

Let's calculate:

Total distance: 70 miles
Time taken for the first 35 miles: 1 hour

Since the driver wants to average at least 40 miles per hour for the entire trip, we can calculate the maximum time allowed for the remaining 35 miles:

Total time for the trip = Time taken for the first 35 miles + Time available for the remaining 35 miles

Total time for the trip = 1 hour + Time available for the remaining 35 miles

We know that Average speed = Total distance / Total time

For the average speed over the entire trip to be at least 40 miles per hour, the total time for the trip must be no more than 70 miles / 40 miles per hour.

Let's calculate the maximum time allowed for the remaining 35 miles:

Maximum time allowed = 70 miles / 40 miles per hour

Maximum time allowed = 1.75 hours

Since the driver has already taken 1 hour for the first 35 miles, the remaining time available for the second 35 miles would be:

Remaining time = Maximum time allowed - Time taken for the first 35 miles

Remaining time = 1.75 hours - 1 hour

Remaining time = 0.75 hours

To calculate the lowest average speed for the second 35 miles, we use the formula:

Average speed = Distance / Time

Average speed = 35 miles / 0.75 hours

Average speed ≈ 46.67 miles per hour

Therefore, the lowest average speed the driver can maintain over the second 35 miles and still average at least 40 miles per hour for the entire trip is 46 2/3 miles per hour, or approximately 46.67 miles per hour.
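
As a quick check, here is a minimal Python sketch of the same arithmetic; the variable names are just illustrative and are not part of the original problem statement.

```python
# Verify the required speed for the second half of the trip.

total_distance = 70.0      # miles
target_average = 40.0      # miles per hour, desired for the whole trip
first_leg_distance = 35.0  # miles
first_leg_time = 1.0       # hours

# Maximum total time that still gives at least the target average speed.
max_total_time = total_distance / target_average            # 1.75 hours

# Time left for the second 35 miles after the first leg.
remaining_time = max_total_time - first_leg_time            # 0.75 hours

# Minimum average speed needed on the second leg.
second_leg_distance = total_distance - first_leg_distance   # 35 miles
required_speed = second_leg_distance / remaining_time       # ~46.67 mph

print(f"Maximum total time: {max_total_time} hours")
print(f"Remaining time:     {remaining_time} hours")
print(f"Required speed:     {required_speed:.2f} miles per hour")
```

Running this prints a required speed of 46.67 miles per hour, matching the result above.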