
Show that if x is a nonzero column vector in R^n, then the nxn matrix:

A = I - 2/||x||^2 * xx^T
is orthogonal.

Notation key:
||x|| = norm of x
x^T = transpose of x
I = identity matrix.

Let me try to convince a math student to use the "physics" notation that many mathematicians don't like.

Let's work with the elements of matrices and use a notation for those entries instead of for the matrix itself. What we do in physics is attach indices to the matrix. We write:

A_{i,j}

to denote the matrix element in the i-th row and j-th column (starting from the upper-left corner, go i steps down and j steps to the right).

The unit matrix is denoted by the Kronecker delta symbol:

delta_{i,j}

which is zero if i is not equal to j and one if they are equal.

Finally, we use the Einstein summation convention for repeated indices. If
C = AB, you can write this as:

C_{i,j} = A_{i,k}B_{k,j}

the repeated index k is summed over.
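The summation convention maps directly onto NumPy's einsum function. A minimal sketch (the sample matrices here are mine, not from the original post):

```python
import numpy as np

# Einstein summation: C_{i,j} = A_{i,k} B_{k,j}, with the repeated
# index k summed over.  np.einsum spells out exactly this recipe.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

C = np.einsum('ik,kj->ij', A, B)   # same as the matrix product A @ B
print(np.allclose(C, A @ B))       # True
```

The string 'ik,kj->ij' is a literal transcription of C_{i,j} = A_{i,k}B_{k,j}: the index k appears in both inputs but not the output, so it is summed.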

You can write the square of the norm of x as the inner product with itself, so:

||x||^2 = x_{k}x_{k}
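In NumPy terms, x_{k}x_{k} is just the dot product of x with itself (the sample vector is mine, for illustration):

```python
import numpy as np

# ||x||^2 = x_k x_k: summing over the repeated index k is the
# dot product of x with itself, i.e. the squared norm.
x = np.array([3.0, 4.0])
print(np.dot(x, x))            # 25.0
print(np.linalg.norm(x) ** 2)  # 25.0
```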

The matrix A in this problem can be written in this "index notation" as:

A_{i,j} = delta_{i,j} - 2/||x||^2 x_{i}x_{j}

You see that we no longer need that stupid T symbol on the x. We also don't need to indicate that the index j runs from left to right, which would force you to rotate the column vector x so that it lies on its side.

For an orthogonal matrix, the matrix product with its transpose is the unit matrix. The transpose of A is A itself in this case, as you can easily see by interchanging i with j:
A_{i,j} = A_{j,i}.
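You can check the symmetry numerically. A quick sketch (the sample vector x is my choice, not part of the problem):

```python
import numpy as np

# Build A_{i,j} = delta_{i,j} - 2/||x||^2 x_i x_j for a sample x
# and confirm it equals its own transpose: A_{i,j} = A_{j,i}.
x = np.array([1.0, 2.0, 2.0])
A = np.eye(3) - 2.0 / np.dot(x, x) * np.outer(x, x)
print(np.allclose(A, A.T))  # True: A is symmetric
```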

In the general case you need to show that:

A_{i,k}A_{j,k} = delta_{i,j}

In our case A_{j,k} = A_{k,j} so this reduces to an ordinary matrix product, but that doesn't really matter.

If you work out the product (and sum over the repeated index) you obtain four terms.
The product of the deltas gives you:

delta_{i,k}delta_{j,k} = delta_{i,j}
The three other terms cancel. Twice you get the product of a delta with the second term of A; after summation over the repeated index, each of these gives:

-2/||x||^2 x_{i}x_{j}

so together they contribute -4/||x||^2 x_{i}x_{j}.
The product of the two second terms in A gives you:

4/||x||^4 x_{i}x_{j}x_{k}x_{k}
Summation over the repeated index k gives a factor ||x||^2, so this term becomes 4/||x||^2 x_{i}x_{j}, which cancels the -4/||x||^2 x_{i}x_{j} from the previous two terms. What remains is delta_{i,j}, so A is orthogonal.
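The whole argument can be sanity-checked numerically. A minimal NumPy sketch (the random vector and dimension are arbitrary choices of mine):

```python
import numpy as np

# For a random nonzero x in R^n, build A = I - 2/||x||^2 xx^T
# and verify A A^T = I, i.e. that A is orthogonal.
rng = np.random.default_rng(0)
n = 5
x = rng.standard_normal(n)

A = np.eye(n) - 2.0 / np.dot(x, x) * np.outer(x, x)
print(np.allclose(A @ A.T, np.eye(n)))  # True: A is orthogonal
```

(This A is the Householder reflection about the hyperplane orthogonal to x, which is why it is both symmetric and orthogonal.)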

Thanks Count. I finally had time to go through and understand this proof. I now understand, but this wasn't simple. I really appreciate the help.

Similar Questions

  1. math

    If A^TA is an invertible matrix, prove that the column vectors of A are linearly independent. You know that if statement X implies statement Y then that is equivalent to Not(Y) implies Not(X). You can start by taking the column vectors …
  2. Math

    How do we know the i-th row of an invertible matrix B is orthogonal to the j-th column of B^-1, if i is not equal to j?
  3. Linear Algebra, orthogonal

    The vector v lies in the subspace of R^3 and is spanned by the set B = {u1, u2}. Making use of the fact that the set B is orthogonal, express v in terms of B where, v = 1 -2 -13 B = 1 1 2 , 1 3 -1 v is a matrix and B is a set of 2 …
  4. Math

    Mark each of the following True or False. ___ a. All vectors in an orthogonal basis have length 1. ___ b. A square matrix is orthogonal if its column vectors are orthogonal. ___ c. If A^T is orthogonal, then A is orthogonal. ___ d. …
  5. Linear Algebra

    Knowing u = (4,0,-3), v = (x,3,2) and that the orthogonal projection of v on u is a vector of norm 6, determine x. Thank you
  6. Linear Algebra

    Ok this is the last one I promise! It's from a sample exam and I'm practicing for my finals :) Verify if the following 4 points are consecutive vertices of a parallelogram: A(1,-1,1); B(3,0,2);C(2,3,4);D(0,2,3) (b) Find an orthogonal …
  7. Math

    Knowing u = (4,0,-3), v = (x,3,2) and that the orthogonal projection of v on u is a vector of norm 6, determine x. Thank you
  8. linear algebra-urgent

    1)let w=[3;4] and u=[1;2] a) find the projection p of u onto w. I found this to be p=[1.32;1.76] b) find a scalar k for which the vector kp has a norm that is equal to one. k=?
  9. precalc

    Given a square matrix M, we say that a nonzero vector v is an eigenvector of M if Mv=kv for some real number k. The real number k is called the eigenvalue of v with respect to M. 1. Let v be an eigenvector of the matrix M with eigenvalue …
  10. Linear Algebra

    Hi, I really need help with these True/False questions: (a) If three vectors in R^3 are orthonormal then they form a basis in R^3. (b) If Q is square orthogonal matrix such that Q^2018 = I then Q^2017 = Q^T. (c) If B is square orthogonal …

