
Homework Help: Algebra

Posted by omar on Monday, February 27, 2012 at 1:57am.

Find the roots of the given equation by completing the square.

αx^2 + βx + δ = 0
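A sketch of the standard completing-the-square steps, assuming α ≠ 0 and treating δ as the constant term:

\[
\begin{aligned}
\alpha x^2 + \beta x + \delta &= 0 \\
x^2 + \tfrac{\beta}{\alpha}x &= -\tfrac{\delta}{\alpha} \\
x^2 + \tfrac{\beta}{\alpha}x + \left(\tfrac{\beta}{2\alpha}\right)^2 &= \left(\tfrac{\beta}{2\alpha}\right)^2 - \tfrac{\delta}{\alpha} \\
\left(x + \tfrac{\beta}{2\alpha}\right)^2 &= \tfrac{\beta^2 - 4\alpha\delta}{4\alpha^2} \\
x &= \frac{-\beta \pm \sqrt{\beta^2 - 4\alpha\delta}}{2\alpha}
\end{aligned}
\]

This recovers the usual quadratic formula with α, β, δ in place of a, b, c; the roots are real when β^2 − 4αδ ≥ 0.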
