If A^TA is an invertible matrix, prove that the column vectors of A are linearly independent.
Recall that "statement X implies statement Y" is logically equivalent to its contrapositive: Not(Y) implies Not(X).
So you can start by assuming that the column vectors of A are linearly dependent and then show that A^TA cannot be invertible.
If the column vectors are linearly dependent, then the "null space" of A (a.k.a. the "kernel") is at least one-dimensional. So there exists a nonzero vector V such that:
A.V = 0 (the null vector)
You can see this as follows. There must exist a nontrivial linear combination of the column vectors that yields the zero vector. If we denote the column vectors by A_1, A_2, ...
and the linear combination by:
lambda_1 A_1 + lambda_2 A_2 + ...
Then you see that this linear combination is just:
A.V
where V is the vector
V = (lambda_1, lambda_2, ...)
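In symbols, the identification above (writing the M column vectors side by side) reads:

```latex
A V \;=\;
\begin{pmatrix} | & & | \\ A_1 & \cdots & A_M \\ | & & | \end{pmatrix}
\begin{pmatrix} \lambda_1 \\ \vdots \\ \lambda_M \end{pmatrix}
\;=\; \lambda_1 A_1 + \cdots + \lambda_M A_M .
```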
Note that if A is an N by M matrix, then V has M components, while the column vectors of A have N components. The matrix A^TA is a square M by M matrix. Let's apply it to the vector V:
A^TA.V = A^T.(A.V) = A^T.0 = 0
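You can check this step numerically. Here is a minimal sketch in plain Python, using a hypothetical 3 by 2 matrix whose second column is twice the first (so its columns are dependent, with coefficients V = (2, -1)):

```python
# A is a 3x2 matrix whose second column is twice the first,
# so the columns satisfy 2*A_1 - 1*A_2 = 0.
A = [[1, 2],
     [3, 6],
     [5, 10]]

# V collects the coefficients of that dependence.
V = [2, -1]

def matvec(M, x):
    """Multiply matrix M (a list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

# A.V is the zero vector in R^3 ...
print(matvec(A, V))                   # [0, 0, 0]

# ... so A^T.(A.V) is the zero vector in R^2 as well.
print(matvec(transpose(A), matvec(A, V)))   # [0, 0]
```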
This proves that A^TA is not invertible: if a matrix Q is invertible, then the equation Q.X = Y always has the unique solution X = Q^(-1).Y. In particular, this must be true if you take Y to be the zero vector 0.
So, if Q is invertible, the equation Q.X = 0 has the unique solution X = 0. Hence, if you know that some vector V not equal to zero is mapped to zero by Q, then Q cannot be invertible.
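Putting it all together: for a matrix with dependent columns, A^TA really does come out singular. A short sketch in plain Python, again using a hypothetical 3 by 2 matrix whose second column is twice the first:

```python
# A 3x2 matrix with linearly dependent columns
# (the second column is twice the first).
A = [[1, 2],
     [3, 6],
     [5, 10]]

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(M, N):
    """Multiply matrices M and N, each given as a list of rows."""
    Nt = transpose(N)
    return [[sum(a * b for a, b in zip(row, col)) for col in Nt]
            for row in M]

AtA = matmul(transpose(A), A)   # the square 2x2 matrix A^T A
print(AtA)                      # [[35, 70], [70, 140]]

# A 2x2 matrix is invertible iff its determinant is nonzero.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
print(det)                      # 0  ->  A^T A is singular, as the argument predicts
```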
Beautiful. Thanks, Count Iblis!