In school we're learning to use the quadratic formula. I understand how to use it and what its applications are, but I don't understand why sometimes it's okay to simplify and sometimes it's not. For example, I got the result (-2 ± 2√6)/2. According to the back of my book, this is the correct answer. I would normally just divide the whole thing by two and therefore be left with -1 ± √6, but this gives me a different answer. So, my question is: when do I reduce, and are there rules for it?

When working with the quadratic formula, it's important to understand when it is appropriate to simplify the expression and when it is not. In this case, you correctly obtained the answer (-2 ± 2√6)/2. Now, let's discuss when and how to simplify this expression.
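For concreteness, here is a hypothetical equation that produces exactly this result (your original problem may well have been different): x² + 2x − 5 = 0, so a = 1, b = 2, c = −5. Applying the quadratic formula:

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} = \frac{-2 \pm \sqrt{2^2 - 4(1)(-5)}}{2(1)} = \frac{-2 \pm \sqrt{24}}{2} = \frac{-2 \pm 2\sqrt{6}}{2}.$$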

To decide whether to simplify a square root expression like √a, we usually look for perfect square factors in the radicand (the number inside the square root). In this case, the radicand is 6.
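That step has most likely already happened once in your work: if your discriminant was 24 (as in the hypothetical equation above), the 2 in 2√6 came from pulling the perfect square factor 4 out of the radicand:

$$\sqrt{24} = \sqrt{4 \cdot 6} = \sqrt{4}\,\sqrt{6} = 2\sqrt{6}.$$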

Since 6 has no perfect square factors (other than 1), √6 cannot be simplified any further. Any remaining simplification therefore has to happen in the fraction as a whole, not under the radical.

And you can simplify the fraction. The key is that 2 divides the denominator and every term of the numerator (both the -2 and the 2√6). Dividing each term by this common factor gives:

(-2/2) ± (2√6/2) = -1 ± √6.
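Another way to see why this is legitimate is to factor the common 2 out of the entire numerator before cancelling:

$$\frac{-2 \pm 2\sqrt{6}}{2} = \frac{2\left(-1 \pm \sqrt{6}\right)}{2} = -1 \pm \sqrt{6}.$$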

So the expression (-2 ± 2√6)/2 does simplify to -1 ± √6; they are two ways of writing the same numbers, so they cannot give different answers. If you evaluated them and got different decimals, the likely culprit is dividing only one term of the numerator by 2, producing -1 ± 2√6 or -2 ± √6, both of which really are different numbers.
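A quick numerical check confirms the two forms agree (using √6 ≈ 2.4495):

$$\frac{-2 + 2\sqrt{6}}{2} \approx \frac{-2 + 4.8990}{2} \approx 1.4495, \qquad -1 + \sqrt{6} \approx -1 + 2.4495 = 1.4495,$$

and similarly for the minus sign, where both forms give approximately -3.4495.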

In summary: when the quadratic formula gives you a fraction, you may cancel a common factor between the numerator and denominator only if it divides every term of the numerator. The square root itself, on the other hand, can be simplified only when the radicand has a perfect square factor to pull out.
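As a contrasting example (not your problem, just an illustration of when cancelling is not allowed): in

$$\frac{-2 \pm \sqrt{6}}{2},$$

the 2 is not a factor of √6, so you cannot cancel it against the whole numerator; the most you can do is split the fraction as -1 ± √6/2.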