Rachel allows herself 1 hour to reach a sales appointment 50 miles away. After she has driven 30 miles, she realizes that she must increase her speed by 15 mph in order to get there on time. What was her speed for the first 30 miles?

Let her speed for the first leg be x mph
and her speed for the second leg be x + 15 mph.

Then her time for the first leg is 30/x
and her time for the second leg is 20/(x+15).

So 30/x + 20/(x+15) = 1.
Multiplying both sides by x(x+15) and simplifying, I got
x^2 + 35x - 450 = 0
(x+10)(x-45) = 0
x = -10, which is silly, or
x = 45 mph
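Before worrying about the factoring, the x = 45 root can be sanity-checked by plugging it straight into the time equation (a throwaway snippet; nothing here comes from the problem beyond the given numbers):

```python
# Plug x = 45 into the time equation: 30 mi at x mph, then 20 mi at (x + 15) mph.
x = 45
total_hours = 30 / x + 20 / (x + 15)  # 2/3 hour + 1/3 hour
print(total_hours)
```

The two legs come out to 2/3 hour and 1/3 hour, i.e. exactly the 1-hour budget.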

But wouldn't that actually factor into

(x-10)(x+45)

which would imply that the positive value for x is actually 10.

Plugging that back into the problem, that would mean she goes 30 miles at 10 miles per hour and 20 miles at 25 miles per hour to cover the 50 miles in an hour?

How can she go 30 miles at 10 miles per hour? That speed doesn't seem correct.
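That suspicion is easy to confirm numerically by plugging x = 10 into the same time equation (the helper name total_time is made up just for this check):

```python
def total_time(x):
    # Hours for 30 mi at x mph plus 20 mi at (x + 15) mph.
    return 30 / x + 20 / (x + 15)

print(total_time(10))  # 3.8 -- almost four hours, nowhere near the 1-hour budget
```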

Is it possible that the equation should have been set up as:

30 miles (x miles per 1 hour) + 20 miles ((x + 15) miles per 1 hour) = ?

I'm stuck.


Your original setup was right, and so was your first factorization; the slip is a sign typo in the simplified quadratic.

Starting from

30/x + 20/(x+15) = 1

and multiplying both sides by x(x+15):

30(x + 15) + 20x = x(x + 15)
30x + 450 + 20x = x^2 + 15x
50x + 450 = x^2 + 15x
0 = x^2 - 35x - 450

Note the minus sign on 35x. This quadratic factors as

(x - 45)(x + 10) = 0

so x = 45 or x = -10, and discarding the negative root gives x = 45 mph.

The alternative factorization (x - 10)(x + 45) expands to x^2 + 35x - 450, which matches the quadratic as you typed it, but not the one the equation actually produces. And your instinct was correct that 10 mph makes no sense: at 10 mph the first 30 miles alone would take 3 hours.

Check: the first leg takes 30/45 = 2/3 hour, and the second leg takes 20/60 = 1/3 hour, for a total of exactly 1 hour.

Therefore, Rachel's speed for the first 30 miles was 45 mph.