3. Suppose A is symmetric positive definite and Q is an orthogonal matrix (square with orthonormal columns). True or false (with a reason or counterexample)?

a) (Q^(T))AQ is a diagonal matrix
b) (Q^(T))AQ is a symmetric positive definite matrix
c) (Q^(T))AQ has the same eigenvalues as A

-------------------

4. Compute (A^(T))A and A(A^(T)), their eigenvalues, and eigenvectors of length one, and calculate the singular value decomposition of the matrix:
A = [1 1 0]
    [0 1 1]

any help would be appreciated, thanks!

To decide whether the statements in question 3 are true or false, and to find the eigenvalues, eigenvectors, and singular value decomposition asked for in question 4, follow the steps below:

Question 3:
a) (Q^T)AQ is a diagonal matrix: False.
Explanation: This only happens when the columns of Q are eigenvectors of A. The spectral theorem guarantees that such a Q exists, but an arbitrary orthogonal Q will not diagonalize A. Counterexample: take
A = [1 0]   and   Q = (1/√2)[1 -1]
    [0 2]                   [1  1]
Then A is symmetric positive definite and Q is orthogonal (a rotation by 45°), but
(Q^T)AQ = (1/2)[3 1]
               [1 3]
which is not diagonal.

b) (Q^T)AQ is a symmetric positive definite matrix: True.
Explanation: It is symmetric because ((Q^T)AQ)^T = (Q^T)(A^T)Q = (Q^T)AQ. It is positive definite because for any x ≠ 0 we have x^T(Q^T)AQx = (Qx)^T A (Qx) > 0, since Qx ≠ 0 (Q is invertible) and A is positive definite.

c) (Q^T)AQ has the same eigenvalues as A: True.
Explanation: Since Q is orthogonal, Q^T = Q^(-1), so (Q^T)AQ = (Q^(-1))AQ is a similarity transformation of A, and similar matrices have the same characteristic polynomial, hence the same eigenvalues. Concretely, if Av = λv, then (Q^T)AQ(Q^T v) = (Q^T)Av = λ(Q^T v).
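If you want to sanity-check all three parts numerically, here is a quick NumPy sketch. The particular A and Q are just example choices, not part of the problem:

import numpy as np

# Example choices (not from the problem): a symmetric positive definite A
# and an orthogonal Q (rotation by 45 degrees).
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
Q = np.array([[c, -s],
              [s,  c]])

B = Q.T @ A @ Q

print(B)                                  # (a) not diagonal: off-diagonal entries are 0.5
print(np.allclose(B, B.T))                # (b) symmetric: True
print(np.all(np.linalg.eigvalsh(B) > 0))  # (b) positive definite: True
print(np.linalg.eigvalsh(B))              # (c) [1. 2.], the same eigenvalues as A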

Question 4:
Compute (A^T)A (a 3x3 matrix):
A^T = [1 0]
      [1 1]
      [0 1]

(A^T)A = [1 0]   [1 1 0]   [1 1 0]
         [1 1] * [0 1 1] = [1 2 1]
         [0 1]             [0 1 1]

Compute A(A^T) (a 2x2 matrix):
A(A^T) = [1 1 0]   [1 0]   [2 1]
         [0 1 1] * [1 1] = [1 2]
                   [0 1]
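You can double-check both products with a short NumPy sketch, reusing the A from the problem:

import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1]])

print(A.T @ A)   # [[1 1 0]
                 #  [1 2 1]
                 #  [0 1 1]]
print(A @ A.T)   # [[2 1]
                 #  [1 2]]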

A itself is 2x3, so it is not square and has no eigenvalues of its own. For the SVD we need the eigenvalues and unit eigenvectors of the square matrices A(A^T) and (A^T)A.

Eigenvalues of A(A^T): solve det(A(A^T) - λI) = 0.

A(A^T) - λI = [2 - λ    1  ]
              [  1    2 - λ]

det(A(A^T) - λI) = (2 - λ)^2 - 1 = λ^2 - 4λ + 3 = (λ - 3)(λ - 1) = 0
λ = 3, λ = 1

Unit eigenvectors of A(A^T):
For λ = 3: (2 - 3)u1 + u2 = 0 gives u2 = u1, so u = (1/√2)[1; 1].
For λ = 1: (2 - 1)u1 + u2 = 0 gives u2 = -u1, so u = (1/√2)[1; -1].

Eigenvalues of (A^T)A: since (A^T)A is 3x3 with rank 2, it has the same two nonzero eigenvalues, 3 and 1, plus the eigenvalue 0.

Unit eigenvectors of (A^T)A:
For λ = 3: ((A^T)A - 3I)v = 0 gives v2 = 2v1 and v3 = v1, so v = (1/√6)[1; 2; 1].
For λ = 1: ((A^T)A - I)v = 0 gives v2 = 0 and v3 = -v1, so v = (1/√2)[1; 0; -1].
For λ = 0: (A^T)Av = 0 gives v1 = -v2 and v3 = -v2, so v = (1/√3)[1; -1; 1].
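You can confirm these eigenvalues and eigenvectors with numpy.linalg.eigh. Note it returns eigenvalues in ascending order, and its eigenvectors may differ from the hand computation by a sign:

import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1]])

# eigh is for symmetric matrices; it returns eigenvalues in ascending order
# and orthonormal eigenvectors as columns.
small_vals, U_cols = np.linalg.eigh(A @ A.T)
big_vals, V_cols = np.linalg.eigh(A.T @ A)

print(small_vals)  # [1. 3.]
print(big_vals)    # approximately [0. 1. 3.]
print(U_cols)      # unit eigenvectors of A(A^T), up to sign
print(V_cols)      # unit eigenvectors of (A^T)A, up to sign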

Now, to calculate the singular value decomposition (SVD) of matrix A:
1. Compute (A^T)A (done above):
(A^T)A = [1 1 0]
         [1 2 1]
         [0 1 1]
Its entries are the dot products of the columns of A.

2. Find the eigenvalues and eigenvectors of (A^T)A:
As computed above, the eigenvalues are 3, 1, and 0, with unit eigenvectors (1/√6)[1; 2; 1], (1/√2)[1; 0; -1], and (1/√3)[1; -1; 1].

3. Compute A(A^T) (done above):
A(A^T) = [2 1]
         [1 2]
Its entries are the dot products of the rows of A.

4. Find the eigenvalues and eigenvectors of A(A^T):
As computed above, the eigenvalues are 3 and 1, with unit eigenvectors (1/√2)[1; 1] and (1/√2)[1; -1].

5. Calculate the singular value decomposition:
The singular value decomposition of A is given by:
A = U * Σ * V^T

U is the matrix whose columns are the unit eigenvectors of A(A^T), Σ is a 2x3 matrix whose diagonal entries are the singular values (the square roots of the nonzero eigenvalues, which are the same for A(A^T) and (A^T)A), and V is the matrix whose columns are the unit eigenvectors of (A^T)A. Order the columns by decreasing eigenvalue, and choose the signs so that Av_i = σ_i u_i.

With σ1 = √3 and σ2 = 1, this gives

U = (1/√2)[1  1]    Σ = [√3 0 0]    V = [1/√6  1/√2   1/√3]
          [1 -1]        [ 0 1 0]        [2/√6    0   -1/√3]
                                        [1/√6 -1/√2   1/√3]

and you can check that UΣV^T multiplies out to the original A.
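Finally, numpy.linalg.svd gives the same singular values, and reassembling UΣV^T recovers A (again, individual columns of U and V may differ from the hand computation by a sign):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
print(s)  # [1.732... 1.0], i.e. sqrt(3) and 1

# Rebuild the 2x3 Sigma and confirm U @ Sigma @ Vt reproduces A.
Sigma = np.zeros_like(A)
Sigma[:2, :2] = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))  # True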