The infield of a baseball field is a square with sides of 90 feet. A player who hits a home run travels a total of 360 feet around the bases. If an infielder at second base throws the ball to the catcher at home plate, how far does the ball travel? To answer this question, you need to use the Pythagorean Theorem. In this lesson, you will learn about the Pythagorean Theorem and how to apply it to problems such as this one.

Agrhhh, the ball does not go back around the field!

√(90^2 + 90^2) = √16200 ≈ 127.28 feet

The Pythagorean Theorem states that the sum of the squares of the two legs of a right triangle equals the square of the hypotenuse. In this problem, the hypotenuse is the straight-line distance the ball travels from second base to home plate. The two legs of the triangle are the base paths from second base to first base and from first base to home plate, and each leg is a 90-foot side of the square.
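
As a minimal sketch of that relationship in Python (the function name hypotenuse_of is introduced only for this illustration and is not part of the lesson):

    import math

    def hypotenuse_of(leg_a, leg_b):
        # Pythagorean Theorem: hypotenuse^2 = leg_a^2 + leg_b^2
        return math.sqrt(leg_a ** 2 + leg_b ** 2)

    # Both legs here are 90-foot sides of the infield square.
    print(hypotenuse_of(90, 90))  # 127.27922061357856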

Using the Pythagorean Theorem, we can calculate the distance the ball travels from second base to home plate. Both legs of the triangle are 90 feet, since each is a side of the square infield; the 360 feet a home-run hitter runs around the bases is not needed for this calculation. The hypotenuse is the distance the ball travels from second base to home plate.

Using the Pythagorean Theorem, we can calculate the hypotenuse as follows:

Hypotenuse = √(90^2 + 90^2)

Hypotenuse = √(8100 + 8100)

Hypotenuse = √16200

Hypotenuse = 90√2 ≈ 127.28 feet

Therefore, the ball travels about 127.28 feet from second base to home plate, far less than the 360 feet a runner covers going around the bases.
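
As a quick verification sketch (not part of the original problem), Python's standard-library math.hypot returns the hypotenuse directly from the two legs:

    import math

    # math.hypot(a, b) computes sqrt(a*a + b*b), the length of the hypotenuse.
    throw_distance = math.hypot(90, 90)
    print(round(throw_distance, 2))  # 127.28

In general code, math.hypot is preferable to squaring by hand because it guards against overflow for very large inputs, although that hardly matters at 90 feet.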