A plane flies due east for 35 miles and then due south until it is 37 miles from its starting point. How far south did the plane fly?

√(37^2 - 35^2) = 12 miles

To determine how far south the plane flew, we can use the Pythagorean theorem.

First, let's visualize the situation. Place the starting point at the origin (0,0) of a coordinate plane. The plane flies due east for 35 miles, which means it moves 35 miles to the right along the x-axis.

At this point, the plane is located at (35,0). Next, the plane flies due south until it is 37 miles from its starting point. Since the plane is moving directly south, it doesn't change its x-coordinate but only its y-coordinate.

We need to find the change in the y-coordinate, which is the distance flown south. By the Pythagorean theorem, the straight-line distance between the starting point and the final position satisfies:

distance^2 = change_in_x^2 + change_in_y^2

37^2 = 35^2 + change_in_y^2

1369 = 1225 + change_in_y^2

change_in_y^2 = 1369 - 1225 = 144

change_in_y = √144 = 12

Hence, the plane flew 12 miles directly south.
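As a quick numeric check, here is a minimal Python sketch of the same calculation (the variable names east_leg, total_distance, and south_leg are just illustrative):

    import math

    east_leg = 35        # miles flown due east
    total_distance = 37  # straight-line distance from the starting point
    # Rearranged Pythagorean theorem: south_leg^2 = total^2 - east^2
    south_leg = math.sqrt(total_distance**2 - east_leg**2)
    print(south_leg)     # 12.0

Running it prints 12.0, confirming the answer of 12 miles.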