Posted by mathstudent.
Prove that if A is a symmetric n x n matrix, then A has a set of n orthonormal eigenvectors.
I've read the entire page and while it's on the correct topic, it doesn't prove what I'm looking to prove.
I think you want a proof of the "completeness" property, i.e. that there are n orthonormal eigenvectors, not merely that eigenvectors corresponding to different eigenvalues must be orthogonal (or, in the degenerate case of a repeated eigenvalue, that you can choose an orthonormal basis within each eigenspace).
Put differently, this means that the eigenvectors span the entire linear space on which the matrix acts.
You can prove this by induction. If an n-by-n symmetric matrix A has one eigenvector V (normalized to unit length) with eigenvalue lambda, then you consider the linear operator defined as:
A dot x - lambda (V dot x) V
Here x is a vector on which we let the operator act; A dot x is the action of A on x. The second term is the inner product of x with V, multiplied by lambda times the vector V. When x = V this term equals exactly A dot V (since V dot V = 1), so subtracting it maps V to zero.
This means that the linear operator you obtain maps the orthogonal complement of V, which is an (n-1)-dimensional space, into itself. Here the symmetry of A is essential: if x is orthogonal to V, then V dot (A dot x) = (A dot V) dot x = lambda (V dot x) = 0, so A dot x stays in the complement (and on that complement the subtracted term vanishes).
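As a sanity check, here is a small NumPy sketch of this deflation step. The random matrix A and the variable names are illustrative; np.linalg.eigh just supplies the eigenpair (lambda, V):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                  # a random symmetric matrix

lam_all, vecs = np.linalg.eigh(A)  # eigh is for symmetric matrices
lam, V = lam_all[0], vecs[:, 0]    # one eigenpair; V has unit length

# The deflated operator: x -> A x - lam (V . x) V, i.e. B = A - lam * V V^T
B = A - lam * np.outer(V, V)

print(np.linalg.norm(B @ V))       # numerically zero: V is mapped to zero

# The orthogonal complement of V is mapped into itself:
x = rng.standard_normal(4)
x = x - (V @ x) * V                # project x onto the complement of V
print(abs(V @ (B @ x)))            # numerically zero: B x stays orthogonal to V
```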
You can iterate this procedure until you reach the trivial case of a linear operator acting on a 1-dimensional space.
All you need in this proof is the fact that a symmetric linear operator has at least one eigenvector.
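Putting the induction together, a hedged NumPy sketch of the whole procedure might look like this. The function name orthonormal_eigenvectors is made up for illustration, and np.linalg.eigh is used at each step only to supply the "at least one eigenvector" fact:

```python
import numpy as np

def orthonormal_eigenvectors(A):
    """Sketch of the inductive proof: at each step find one eigenvector of A
    restricted to the current subspace, then recurse on the orthogonal
    complement of that eigenvector within the subspace."""
    n = A.shape[0]
    Q = np.eye(n)               # orthonormal basis of the current subspace
    vecs = []
    while Q.shape[1] > 0:
        B = Q.T @ A @ Q         # A restricted to span(Q); still symmetric
        _, U = np.linalg.eigh(B)
        u = U[:, 0]             # one eigenvector of the restriction
        vecs.append(Q @ u)      # back in R^n coordinates; unit length
        # U is orthogonal, so its remaining columns are an orthonormal
        # basis of the complement of u within the subspace
        Q = Q @ U[:, 1:]
    return np.column_stack(vecs)

# usage (illustrative):
M = np.random.default_rng(1).standard_normal((5, 5))
A = (M + M.T) / 2
V = orthonormal_eigenvectors(A)  # columns: n orthonormal eigenvectors of A
```

The symmetry of A is what makes Q @ U[:, 1:] span an invariant subspace at every step, which is exactly the point of the proof above.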
Can an inner product space V have a T-invariant subspace U whose orthogonal complement is NOT T-invariant?