Solve the problem. Round to the nearest tenth, if necessary.

A long-distance runner runs 5 miles directly south and then faces due east and runs another 7 miles. How far is the runner, in a straight line, from his starting point?

a^2 + b^2 = c^2

5^2 + 7^2 = c^2
25 + 49 = c^2
74 = c^2
c = √74 ≈ 8.6
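
As a quick check, the same computation can be done in a few lines of Python (a minimal sketch; math.hypot returns the hypotenuse of a right triangle directly):

    import math

    south = 5.0  # miles run due south
    east = 7.0   # miles run due east

    # math.hypot(a, b) computes sqrt(a**2 + b**2), the length of the hypotenuse
    distance = math.hypot(south, east)
    print(round(distance, 1))  # prints 8.6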

To solve this problem, we can use the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the distance the runner traveled south is one leg of a right triangle, and the distance he traveled east is the other leg. Because south and east are perpendicular directions, the two legs meet at a right angle, and the straight-line distance back to the starting point is the hypotenuse.

Let a represent the 5 miles traveled south and b represent the 7 miles traveled east. We can then find the distance from the starting point using the Pythagorean theorem.

The formula for the Pythagorean theorem is:

c^2 = a^2 + b^2

where c is the hypotenuse and a and b are the other two sides.

In this case, we want to find the hypotenuse, which represents the distance from the starting point. Let's call it x.

x^2 = 5^2 + 7^2
x^2 = 25 + 49
x^2 = 74

To solve for x, we take the positive square root of both sides (a distance cannot be negative):
x = √74

Now we can evaluate x with a calculator: √74 ≈ 8.602. Rounded to the nearest tenth, the runner is approximately 8.6 miles from his starting point.
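
For completeness, here is the step-by-step version of the same calculation in Python, mirroring the algebra above (a sketch assuming only the standard math module):

    import math

    x_squared = 5**2 + 7**2   # 25 + 49 = 74
    x = math.sqrt(x_squared)  # √74 ≈ 8.602325267042627
    print(round(x, 1))        # 8.6 miles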