A rectangular football field is 64 meters wide and 100 meters long. A player runs in a straight line from one corner of the field to the diagonally opposite corner.

How far did the player run?

Using the Pythagorean Theorem, we can calculate the distance the player ran.

The Pythagorean Theorem applies here because the diagonal forms a right triangle with one width and one length of the field. The theorem states that in a right triangle, the square of the hypotenuse (the longest side, opposite the right angle) equals the sum of the squares of the other two sides.
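In symbols, with legs a and b and hypotenuse c, the theorem reads a^2 + b^2 = c^2, so c = √(a^2 + b^2).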

So, in this case, the distance the player ran (the hypotenuse) is the square root of (64^2 + 100^2).

This simplifies to the square root of (4096 + 10000), which is the square root of 14096, approximately 118.7266.

Rounding to the nearest tenth, the player ran approximately 118.7 meters.
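
For readers who want to verify the arithmetic, here is a minimal Python sketch (the variable names are just illustrative); it uses the standard library's math.hypot, which computes the hypotenuse directly:

```python
import math

# Field dimensions in meters, from the problem statement.
width = 64
length = 100

# math.hypot(a, b) returns sqrt(a**2 + b**2), the length of the diagonal.
diagonal = math.hypot(width, length)

print(round(diagonal, 1))  # 118.7
```

As a side note, math.hypot is preferable to writing math.sqrt(width**2 + length**2) by hand, since it avoids intermediate overflow or underflow for extreme inputs.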