If two planes leave the same airport at 1:00 PM, how many miles apart will they be at 3:00 PM if one travels directly north at 150 mph and the other travels directly west at 200 mph?

I understand that d = sqrt[300^2 + 400^2]
d = sqrt[250,000]
But I'm not sure how you get 500 from this point; I don't remember!! I used a calculator for [300^2 + 400^2], but I'm not sure how to do the rest?

just hit the √ key!

√250000 = 500

Thank you! It's been 8 years and I'm trying to go back to school, trying to re-learn and/or remember. Thanks a bunch!!!!

To find the distance between the two planes at 3:00 PM, you have correctly set up the calculation using the Pythagorean theorem:

d = sqrt[300^2 + 400^2]
d = sqrt[250,000]

Now, to find the distance, you need to evaluate the square root of 250,000, which is exactly 500 (since 500^2 = 250,000).

So, the two planes will be 500 miles apart at 3:00 PM.
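If you want to check the square-root step on a computer instead of a calculator, here's a quick sketch using Python's standard `math` module:

```python
import math

# Evaluate the square root from the problem: sqrt(250,000)
d = math.sqrt(250_000)
print(d)  # 500.0
```

This confirms that 250,000 is a perfect square, which is why the answer comes out so cleanly.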

To find the distance between the two planes at 3:00 PM, you can use the Pythagorean theorem. Let's break it down step by step:

1. The first plane is traveling north at 150 mph for 2 hours from 1:00 PM to 3:00 PM. So, it covers a distance of 150 mph * 2 hours = 300 miles.

2. The second plane is traveling west at 200 mph for 2 hours from 1:00 PM to 3:00 PM. So, it covers a distance of 200 mph * 2 hours = 400 miles.

3. We now have two sides of a right triangle: one side measuring 300 miles (the northward distance) and the other measuring 400 miles (the westward distance).

4. According to the Pythagorean theorem, the square of the hypotenuse (the distance between the two planes) is equal to the sum of the squares of the other two sides. So,

d^2 = 300^2 + 400^2

d^2 = 90,000 + 160,000

d^2 = 250,000

5. To find the value of d (the distance between the two planes), we take the square root:

d = √250,000

d = 500

Therefore, the two planes will be exactly 500 miles apart at 3:00 PM.
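The steps above can be sketched in a few lines of Python, if that helps with checking your work as you re-learn this material:

```python
import math

hours = 2             # from 1:00 PM to 3:00 PM
north = 150 * hours   # first plane: 150 mph north -> 300 miles
west = 200 * hours    # second plane: 200 mph west -> 400 miles

# Pythagorean theorem: the planes' paths form the legs of a right triangle
d = math.sqrt(north**2 + west**2)
print(d)  # 500.0
```

As a side note, 300-400-500 is just the classic 3-4-5 right triangle scaled up by a factor of 100, which is why this problem works out to a whole number.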