Posted by **kim** on Sunday, December 3, 2006 at 6:13pm.

Having a little problem.

Prove that the roots of

ax^2 + (a + b)x + b = 0

are real for all values of a and b.

(Note: the "x"s are the variable, not multiplication signs.)

The quadratic ax^2 + bx + c has discriminant

D = b^2 - 4ac.

If D is nonnegative, then the equation has real roots.

In this case you have

D = (a + b)^2 - 4ab = a^2 + 2ab + b^2 - 4ab = a^2 - 2ab + b^2 = (a - b)^2,

which is greater than or equal to zero because it is a square. So the roots are real for all values of a and b.
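As a quick sanity check of the identity (a + b)^2 - 4ab = (a - b)^2, a few lines of Python can test it over many random integer coefficients (the function name `discriminant` here is just illustrative, not from the original problem):

```python
import random

def discriminant(a, b):
    """Discriminant of a*x^2 + (a + b)*x + b."""
    return (a + b) ** 2 - 4 * a * b

# Spot-check with random integer coefficients: the discriminant
# should equal (a - b)^2 exactly, and hence is never negative.
random.seed(0)
for _ in range(1000):
    a = random.randint(-100, 100)
    b = random.randint(-100, 100)
    D = discriminant(a, b)
    assert D == (a - b) ** 2
    assert D >= 0
print("all 1000 checks passed")
```

Integer coefficients keep the arithmetic exact, so the comparison with (a - b)^2 needs no floating-point tolerance.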
