posted by kim.
Having a little problem.
Prove that the roots of
ax^2 + (a + b)x + b = 0 are real for all values of a and b.
note the "x"s aren't multiplication signs.
A quadratic equation ax^2 + bx + c = 0 has the discriminant
D = b^2 - 4ac.
If D is nonnegative then the equation has real roots.
In this case you have
D = (a + b)^2 - 4ab = a^2 + 2ab + b^2 - 4ab = a^2 - 2ab + b^2 = (a - b)^2,
which is greater than or equal to zero because it is a square. So the roots are real for any a and b.
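As a quick sanity check (a sketch, not part of the original solution; the function name `real_roots` is my own), the quadratic formula never takes the square root of a negative number here, since the discriminant is the perfect square (a - b)^2:

```python
import math

def real_roots(a, b):
    """Roots of a*x^2 + (a + b)*x + b = 0 via the quadratic formula.

    Assumes a != 0 so the equation is genuinely quadratic.
    The discriminant (a + b)^2 - 4ab equals (a - b)^2, a perfect
    square, so math.sqrt never sees a negative argument.
    """
    disc = (a + b)**2 - 4*a*b
    assert disc == (a - b)**2  # the algebraic identity from the proof
    r = math.sqrt(disc)
    return ((-(a + b) + r) / (2*a), (-(a + b) - r) / (2*a))

# A few sample coefficient pairs, including negative values.
for a, b in [(2, 5), (3, -7), (1, 1), (-4, 9)]:
    print(a, b, real_roots(a, b))
```

Note that ax^2 + (a + b)x + b factors as (ax + b)(x + 1), so one root is always x = -1, which the samples above confirm.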