We are studying the Pythagorean theorem. How do I set this up to solve?

A jogger goes half a mile north and then turns west. If the jogger finishes 1.3 mi from the starting point, how far west did the jogger go?

a^2 + b^2 = c^2

0.5^2 + b^2 = 1.3^2

Solve for b to find the distance the jogger went west.

0.5^2 + b^2 = 1.3^2

0.25 + b^2 = 1.69
b^2 = 1.44
b = 1.2

The jogger went 1.2 mi west.
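
If you want to double-check the arithmetic, here is a quick Python sketch (purely a verification aid; the variable names are mine, not part of the problem):

import math

north = 0.5   # leg: miles jogged north
total = 1.3   # hypotenuse: straight-line distance from the start

# b = sqrt(c^2 - a^2)
west = math.sqrt(total**2 - north**2)
print(round(west, 6))  # 1.2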

To solve this problem using the Pythagorean theorem, we need to set up a right triangle. The distance the jogger goes north is one leg of the triangle, the distance the jogger goes west is the other leg, and the straight-line distance from the starting point to the ending point is the hypotenuse.

Let's start by drawing a diagram to visualize the situation.

End <------ x mi ------ Turn
   \                      |
    \                     |
   1.3 mi               0.5 mi
       \                  |
        \                 |
         \              Start

(The jogger runs north from Start to the turn, then west to End; the straight line from End back to Start is the hypotenuse.)

Since the jogger goes half a mile north and ends up 1.3 miles from the starting point, the north leg of the triangle is 0.5 miles, and the hypotenuse is 1.3 miles.

Now, we can use the Pythagorean theorem, which states that in a right triangle, the square of the length of the hypotenuse is equal to the sum of the squares of the lengths of the other two sides.
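
In code, that relationship can be written as the following Python sketch (the function names are just illustrative, not from the problem):

import math

def hypotenuse(a, b):
    # c = sqrt(a^2 + b^2)
    return math.hypot(a, b)

def missing_leg(a, c):
    # Given one leg a and the hypotenuse c, the other leg is sqrt(c^2 - a^2).
    return math.sqrt(c**2 - a**2)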

In this case, let x be the distance, in miles, that the jogger goes west.

Then the Pythagorean theorem gives:

0.5^2 + x^2 = 1.3^2

Now, we can solve this equation for x, the distance the jogger went west.

0.25 + x^2 = 1.69

Subtracting 0.25 from both sides:

x^2 = 1.69 - 0.25

x^2 = 1.44

Taking the positive square root of both sides (a distance can't be negative):

x = √1.44

x = 1.2

Therefore, the jogger went 1.2 miles west.
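
As a final sanity check, a short Python sketch (again, just illustrative) confirms that these two legs rebuild the 1.3 mi straight-line distance:

import math

north, west = 0.5, 1.2
print(round(math.hypot(north, west), 6))  # 1.3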