As part of their halftime show, the members of the band must walk diagonally from one corner of the football field to the opposite corner. If the field is 100 yards long and 35 yards wide, how far must the band members march?

What is sqrt(100^2 + 35^2)?

To find the distance the band members must march diagonally from one corner of the football field to the opposite corner, we can use the Pythagorean theorem.

The Pythagorean theorem states that in a right-angled triangle, the square of the length of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the length and width of the field form the legs of a right triangle, and the diagonal is the hypotenuse.

Let's calculate the distance using the Pythagorean theorem:

Step 1: Calculate the square of the length and width of the field.
Length^2 = 100^2 = 10,000
Width^2 = 35^2 = 1,225

Step 2: Add the squares of the length and width.
10,000 + 1,225 = 11,225

Step 3: Calculate the square root of the sum.
√11,225 ≈ 105.9

Therefore, the band members must march approximately 105.9 yards diagonally from one corner of the football field to the opposite corner.
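As a quick sanity check, here is a minimal Python sketch (not part of the original question, just an illustration) that reproduces the same calculation with the standard math module:

import math

length = 100  # field length in yards
width = 35    # field width in yards

# The diagonal is the hypotenuse of a right triangle with legs 100 and 35.
diagonal = math.hypot(length, width)  # same as math.sqrt(length**2 + width**2)

print(round(diagonal, 1))  # prints 105.9

Running this confirms the hand calculation above: the diagonal comes out to roughly 105.9 yards.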