Solve the problem. Give your answer to the nearest thousandth if necessary.


A long-distance runner runs 3 miles south and then 7 miles east. How far is the runner from the starting point?

That would be the hypotenuse of a right triangle with legs of 3 and 7: sqrt(3^2 + 7^2) = sqrt(58)


To solve this problem, we can use the Pythagorean theorem, which states that in a right triangle, the square of the length of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the runner runs 3 miles south and then 7 miles east. Because south and east are perpendicular directions, the path forms a right triangle whose two legs are 3 miles and 7 miles.

To find the distance between the runner's starting point and current position (the hypotenuse), we can use the Pythagorean theorem:

Hypotenuse^2 = Leg1^2 + Leg2^2

Plugging in the values, we get:

Hypotenuse^2 = 3^2 + 7^2
Hypotenuse^2 = 9 + 49
Hypotenuse^2 = 58

To find the length of the hypotenuse (the distance from the starting point), we can take the square root of both sides:

Hypotenuse = √58

Using a calculator, √58 ≈ 7.61577..., which rounds to 7.616.

So, to the nearest thousandth, the runner is approximately 7.616 miles from the starting point.
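If you want to check this numerically, here is a minimal Python sketch of the same calculation (the variable names `south` and `east` are just labels for the two legs):

```python
import math

# Legs of the right triangle: 3 miles south, 7 miles east
south = 3
east = 7

# Hypotenuse via the Pythagorean theorem
distance = math.hypot(south, east)  # equivalent to math.sqrt(south**2 + east**2)

print(round(distance, 3))  # 7.616
```

`math.hypot` computes sqrt(x² + y²) directly, so it gives the same result as applying the Pythagorean theorem by hand.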