5. A group of children are playing baseball in a small field. Their baseball diamond is a square with sides 14.1 m. When throwing the ball from home plate to second base how far must it go? Round your answer to one decimal place.

14.1√2 = ?

To find the distance from home plate to second base, we need to determine the length of the diagonal of the square diamond. The square has sides of 14.1 m, so we can use the Pythagorean theorem to find the diagonal.

The Pythagorean theorem states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, two adjacent sides of the square and its diagonal form a right triangle, with the diagonal as the hypotenuse. The two legs are both equal to the side length of the square, so we can call them a and b.

Using the Pythagorean theorem, we have:

a^2 + b^2 = c^2

where c is the length of the diagonal. Plugging in the given values, we have:

14.1^2 + 14.1^2 = c^2

Simplifying the equation, we get:

198.81 + 198.81 = c^2

397.62 = c^2

To solve for c, we take the square root of both sides:

c = √397.62

Calculating the square root, we find:

c ≈ 19.9

Therefore, the distance from home plate to second base is approximately 19.9 m, rounded to one decimal place. (Equivalently, the diagonal of a square with side s is s√2, so 14.1√2 ≈ 19.9, matching the setup at the top.)
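The calculation above can be checked with a short Python sketch (the variable names are illustrative, not part of the problem):

```python
import math

side = 14.1  # side length of the square diamond, in metres

# Diagonal via the Pythagorean theorem: c = sqrt(a^2 + b^2)
diagonal = math.hypot(side, side)

# Sanity check: the diagonal of a square is also side * sqrt(2)
assert math.isclose(diagonal, side * math.sqrt(2))

print(round(diagonal, 1))  # → 19.9
```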