Why can't a^2 + b^2 be factored? I seriously don't know how this works out and my textbook is no help. Please help.

The terms a^2 and b^2 are being added.

Since the squares are added rather than subtracted, there is no difference-of-squares pattern to apply, and a^2 and b^2 have no common factor to pull out, so the expression cannot be factored over the real numbers.
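
For comparison, writing the two patterns out side by side shows why the sign matters (just the standard expansions, nothing beyond the usual textbook identities):

```latex
\begin{align*}
  (a - b)(a + b) &= a^2 + ab - ab - b^2 = a^2 - b^2,\\
  (a + b)(a + b) &= a^2 + 2ab + b^2 \neq a^2 + b^2 \quad \text{unless } ab = 0.
\end{align*}
```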

The factors are (a + ib)(a - ib), where i is the square root of -1, an imaginary number.

These factors are not "real": they involve the imaginary unit i.
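
Multiplying the complex factors out confirms the identity, using i^2 = -1:

```latex
\begin{align*}
  (a + ib)(a - ib) &= a^2 - iab + iab - i^2 b^2\\
                   &= a^2 - (-1)b^2\\
                   &= a^2 + b^2.
\end{align*}
```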

When dealing with an expression like a^2 + b^2, it is indeed not possible to factor it further using real numbers. This is because a^2 + b^2 is a sum of squares, and a sum of squares has no factorization into real linear factors.

To understand why this is the case, let's start from the Pythagorean theorem. The Pythagorean theorem states that in a right-angled triangle, the square of the length of the hypotenuse is equal to the sum of the squares of the other two sides.

Now, think of a and b as the lengths of the two legs of a right-angled triangle, so that a^2 + b^2 is the square of the hypotenuse. In particular, a^2 + b^2 is strictly positive whenever a and b are not both zero. If we were to try and factor a^2 + b^2 over the real numbers, we would need two real numbers, let's say x and y, such that a^2 + b^2 = (a + xb)(a + yb).

However, expanding that product gives a^2 + (x + y)ab + xyb^2, so matching it with a^2 + b^2 would require x + y = 0 and xy = 1 at the same time. The first condition forces y = -x, and then the second becomes -x^2 = 1, which no real number satisfies. (Put geometrically: such a factorization would make a^2 + b^2 equal to zero at a = -xb with b nonzero, but the square of a hypotenuse is never zero for a genuine triangle.) Thus, we cannot factor a^2 + b^2 into simpler terms using real numbers.
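
The same conclusion falls out of the quadratic discriminant, treating b as a constant:

```latex
% Treat a^2 + b^2 as a quadratic in the variable a, with coefficients 1, 0, and b^2.
\[
  \Delta = 0^2 - 4 \cdot 1 \cdot b^2 = -4b^2 < 0 \qquad (b \neq 0),
\]
% so there are no real roots for a, and hence no factorization into real linear factors.
```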

It's worth noting that a^2 + b^2 can be factored in other mathematical contexts using complex numbers or trigonometric identities, but that is beyond the scope of factoring over the real numbers.
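
If you want to check this with a computer algebra system, here is a short SymPy sketch (the library and the extension=I option are my choice of illustration, not something from the textbook):

```python
# Sketch: compare factoring a^2 + b^2 over the rationals vs. over the
# Gaussian rationals (the rationals with i adjoined) using SymPy.
from sympy import symbols, factor, expand, I

a, b = symbols('a b')
expr = a**2 + b**2

# Over the ordinary (real/rational) numbers the sum of squares stays unfactored.
print(factor(expr))                   # a**2 + b**2

# Allowing the imaginary unit i as a field extension gives the complex factors.
print(factor(expr, extension=I))      # (a - I*b)*(a + I*b)

# Expanding the complex factors recovers the original expression.
print(expand((a + I*b) * (a - I*b)))  # a**2 + b**2
```

Running it reproduces exactly the situation described above: no factorization appears until imaginary numbers are allowed.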