The playing surface of a football field is 300 ft long and 160 ft wide. If a player runs from one corner of the field to the opposite corner, how many feet does he run?

The answer is 340

To find the distance the player runs from one corner to the opposite corner of the football field, we can use the Pythagorean Theorem. The Pythagorean Theorem states that in a right triangle, the square of the length of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the length and width of the football field form the two sides of the right triangle, and the distance the player runs is the hypotenuse.

So, to find the distance, we can calculate the square root of the sum of the squares of the length and width:

Distance = √(Length^2 + Width^2)

Plugging in the values, we get:

Distance = √(300^2 + 160^2)
Distance = √(90000 + 25600)
Distance = √115600
Distance = 340 feet

Therefore, the player runs exactly 340 feet from one corner to the opposite corner of the football field, since 115600 is a perfect square (340² = 115600).
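The calculation above can be checked with a short Python sketch, where `math.hypot` computes the hypotenuse directly from the two side lengths:

```python
import math

# Dimensions of the playing surface, in feet
length = 300
width = 160

# math.hypot(a, b) returns sqrt(a**2 + b**2), the diagonal of the field
distance = math.hypot(length, width)
print(distance)  # 340.0
```

Because 300² + 160² = 115600 = 340², the result comes out as an exact whole number.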
