A boat travels 5 miles due south and then 11 miles due east. How far is it from the starting point?

To find the distance from the starting point, we can use the Pythagorean theorem. According to the given information, the boat travels 5 miles due south and then 11 miles due east.

To illustrate this, imagine a right-angled triangle, with one side measuring 5 miles (representing the distance traveled due south) and the other side measuring 11 miles (representing the distance traveled due east). The hypotenuse of the triangle represents the direct distance from the starting point to the final position of the boat.

Using the Pythagorean theorem:

c^2 = a^2 + b^2

where c is the length of the hypotenuse, and a and b are the lengths of the other two sides.

In this case, a = 5 miles and b = 11 miles, so we can substitute these values into the formula:

c^2 = 5^2 + 11^2
c^2 = 25 + 121
c^2 = 146

To find the length of c (the hypotenuse), we take the square root of both sides:

c = √146
c ≈ 12.083

Therefore, the boat is approximately 12.083 miles from the starting point.
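As a quick check, here is a short Python sketch of the same calculation. `math.hypot` computes the hypotenuse directly from the two legs, which is exactly the Pythagorean step above:

```python
import math

# Legs of the right triangle: 5 miles south, 11 miles east
south = 5
east = 11

# Hypotenuse via the Pythagorean theorem: sqrt(5**2 + 11**2)
c = math.hypot(south, east)
print(round(c, 3))  # 12.083
```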

Alternatively, we can find the distance using the distance formula:

d = sqrt[(y2-y1)^2 + (x2-x1)^2]

First, we set up the points. Let the starting point be the origin (0, 0); this is (x1, y1).
The boat then travels 5 miles south, reaching (0, -5).
Finally, it travels 11 miles east, ending at (11, -5); this point is (x2, y2).

therefore,
d = sqrt[(-5-0)^2 + (11-0)^2]
d = sqrt(25 + 121)
d = sqrt(146) ≈ 12.083
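The coordinate version can be sketched in Python too, using the same two points chosen above; `math.dist` implements the distance formula between two points:

```python
import math

# Starting point at the origin; end point after 5 mi south, then 11 mi east
start = (0, 0)    # (x1, y1)
end = (11, -5)    # (x2, y2)

# Distance formula: sqrt((x2-x1)**2 + (y2-y1)**2)
d = math.dist(start, end)
print(round(d, 3))  # 12.083
```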

Either way, the boat is about 12.083 miles from where it started. :)