Hi, I really need help with these True/False questions:

(a) If three vectors in R^3 are orthonormal, then they form a basis in R^3.
(b) If Q is a square orthogonal matrix such that Q^2018 = I, then Q^2017 = Q^T.
(c) If B is a square orthogonal matrix, then B^−1 = B^T.
(d) If, for some basis {a_1, a_2}, a vector b satisfies b = x a_1 + y a_2, then magnitude(b) = sqrt(x^2 + y^2).
(e) If the columns of matrix A are orthogonal, then the rows of A are independent.
(f) Row operations do not change the determinant of a matrix.
(g) If det(2A) = det(3A), then A is not invertible.
(h) If det(A) = det(A^−1), then A = I.
(i) There is no matrix A such that det(A^−1) = 0.

Thanks in advance!

Let's go through these questions step by step:

(a) If three vectors in R^3 are orthonormal then they form a basis in R^3.
True. Orthonormal vectors are mutually perpendicular unit vectors, which makes them automatically linearly independent, and any three linearly independent vectors in R^3 form a basis.

(b) If Q is a square orthogonal matrix such that Q^2018 = I, then Q^2017 = Q^T.
True. An orthogonal matrix has the property that its inverse is equal to its transpose, so Q^−1 = Q^T. Multiplying both sides of Q^2018 = I by Q^−1 gives Q^2017 = Q^−1 = Q^T.
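As a quick numerical sanity check, here is a minimal sketch that picks a 2x2 rotation by 2*pi/2018 as a concrete Q (a hypothetical choice of example; any orthogonal Q with Q^2018 = I behaves the same way):

import numpy as np

theta = 2 * np.pi / 2018                        # rotation angle chosen so that Q^2018 = I
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(np.linalg.matrix_power(Q, 2018), np.eye(2)))  # True
print(np.allclose(np.linalg.matrix_power(Q, 2017), Q.T))        # True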

(c) If B is a square orthogonal matrix, then B^−1 = B^T.
True. As noted in part (b), an orthogonal matrix has the property that its inverse is equal to its transpose; that is exactly the statement B^−1 = B^T.

(d) If for some basis {a_1, a_2} for vector b one has b = xa_1 + ya_2, then magnitude(b) = sqrt(x^2 + y^2).
False. The formula holds only when the basis {a_1, a_2} is orthonormal. In general, magnitude(b)^2 = x^2 |a_1|^2 + 2xy (a_1 · a_2) + y^2 |a_2|^2, which reduces to x^2 + y^2 only for perpendicular unit vectors. Counterexample: a_1 = (1, 0), a_2 = (1, 1), b = a_1 + a_2 = (2, 1); then sqrt(x^2 + y^2) = sqrt(2), but magnitude(b) = sqrt(5).
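Here is that counterexample checked numerically (a minimal sketch; the vectors a_1 = (1, 0) and a_2 = (1, 1) are just one convenient non-orthonormal basis):

import numpy as np

a1 = np.array([1.0, 0.0])
a2 = np.array([1.0, 1.0])    # a basis of R^2, but not orthonormal
x, y = 1.0, 1.0
b = x * a1 + y * a2

print(np.linalg.norm(b))     # 2.236... = sqrt(5), the true magnitude
print(np.sqrt(x**2 + y**2))  # 1.414... = sqrt(2), what the formula claims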

(e) If the columns of matrix A are orthogonal, then the rows of A are independent.
False. Orthogonality of the columns means each pair of columns has dot product zero, but the zero vector is orthogonal to everything, so a zero column is allowed. For example, A = [[1, 0], [0, 0]] has orthogonal columns (1, 0) and (0, 0), yet its rows (1, 0) and (0, 0) are linearly dependent.

(f) Row operations do not change the determinant of a matrix.
False. Row operations can affect the determinant of a matrix. Specifically, swapping two rows changes the sign of the determinant, multiplying a row by a scalar multiplies the determinant by the same scalar, and adding a multiple of one row to another does not change the determinant.

(g) If det(2A) = det(3A), then A is not invertible.
True. For an n×n matrix, det(cA) = c^n det(A), since multiplying A by c scales each of its n rows by c. So det(2A) = det(3A) gives 2^n det(A) = 3^n det(A), i.e. (2^n − 3^n) det(A) = 0. Since 2^n ≠ 3^n, this forces det(A) = 0, so A is not invertible.
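A small numerical check of the scaling rule (the matrices below are arbitrary examples, not from the original problem):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                           # invertible, det(A) = -2
print(np.linalg.det(2 * A), np.linalg.det(3 * A))    # -8.0 vs -18.0: not equal

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])                           # singular, det(S) = 0
print(np.linalg.det(2 * S), np.linalg.det(3 * S))    # both 0.0: equal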

(h) If det(A) = det(A^-1), then A = I.
False. Since det(A^−1) = 1/det(A), the condition det(A) = det(A^−1) only says det(A) = 1/det(A), i.e. det(A) = ±1, and many matrices other than I satisfy that. For example, A = −I in R^2 has A^−1 = −I and det(A) = det(A^−1) = 1, yet A ≠ I.
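The counterexample is easy to verify (a minimal sketch using A = −I in R^2):

import numpy as np

A = -np.eye(2)                           # A = -I, clearly not the identity
print(np.linalg.det(A))                  # 1.0
print(np.linalg.det(np.linalg.inv(A)))   # 1.0, equal to det(A)
print(np.allclose(A, np.eye(2)))         # False: A is not I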

(i) There are no matrices A such that det A^−1 = 0.
True. If A^−1 exists, then det(A) det(A^−1) = det(A A^−1) = det(I) = 1, so det(A^−1) = 1/det(A), which can never be zero.
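A quick check that det(A) * det(A^−1) = 1 in practice (the random 4x4 matrix is an arbitrary example; a random matrix is invertible with probability 1):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # almost surely invertible
print(np.linalg.det(A) * np.linalg.det(np.linalg.inv(A)))   # ~1.0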

Sure, I'd be happy to help you with these True/False questions. Let's go through each question, and I'll explain how to determine the answer for each one:

(a) If three vectors in R^3 are orthonormal then they form a basis in R^3.
To answer this question, we need the definition of an orthonormal set of vectors: the vectors are mutually perpendicular (orthogonal) and each has length 1 (normalized).

To determine whether such a set forms a basis of R^3, we need to check that the vectors are linearly independent and span the space. Orthonormal vectors are automatically independent: taking the dot product of any vanishing linear combination with each vector shows every coefficient is zero. Three independent vectors in the three-dimensional space R^3 also span it. Therefore, the statement is True.
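If you want to check a concrete case numerically, here is a sketch (the orthonormal triple is produced by a QR factorization of a random matrix, which is just one convenient way to generate such vectors):

import numpy as np

Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
V = Q                                    # the columns of Q are three orthonormal vectors

print(np.allclose(V.T @ V, np.eye(3)))   # True: the Gram matrix is I (orthonormal)
print(abs(np.linalg.det(V)) > 1e-9)      # True: nonzero det, so the columns form a basis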

(b) If Q is a square orthogonal matrix such that Q^2018 = I, then Q^2017 = Q^T.
To determine if this statement is true or false, we need to recall some properties of orthogonal matrices. An orthogonal matrix Q satisfies the property that Q^T * Q = I, where Q^T represents the transpose of Q.

In this case, we are given that Q^2018 = I. Multiplying both sides on the left by Q^T gives Q^T * Q^2018 = Q^T. By associativity, Q^T * Q^2018 = (Q^T * Q) * Q^2017 = I * Q^2017 = Q^2017, so Q^2017 = Q^T.

Therefore, the statement is True.

(c) If B is a square orthogonal matrix, then B^−1 = B^T.
To determine if this statement is true or false, let's consider the properties of orthogonal matrices. An orthogonal matrix B satisfies the property that B^T * B = I.

In general, the inverse of a square matrix is denoted as A^−1 and satisfies the property that A * A^−1 = I.

From the definition of an orthogonal matrix, we have B^T * B = I. Multiplying both sides on the right by B^−1 gives (B^T * B) * B^−1 = B^−1. By associativity, the left side is B^T * (B * B^−1) = B^T * I = B^T, which means B^T = B^−1.

Therefore, the statement is True.
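A numerical illustration (again using a QR factorization as one convenient way to produce an example orthogonal matrix B):

import numpy as np

B, _ = np.linalg.qr(np.random.default_rng(2).standard_normal((3, 3)))

print(np.allclose(B.T @ B, np.eye(3)))      # True: B is orthogonal
print(np.allclose(np.linalg.inv(B), B.T))   # True: B^-1 equals B^T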

(d) If, for some basis {a_1, a_2} for vector b, one has b = xa_1 + ya_2, then magnitude(b) = sqrt(x^2 + y^2).
To determine if this statement is true or false, we need to understand the properties of vectors and their magnitudes.

Given that b = x a_1 + y a_2, the numbers x and y are the coordinates of b in the basis {a_1, a_2}. Expanding the dot product gives magnitude(b)^2 = x^2 |a_1|^2 + 2xy (a_1 · a_2) + y^2 |a_2|^2, and this reduces to x^2 + y^2 only when a_1 and a_2 are orthonormal, i.e. perpendicular unit vectors. A general basis need not satisfy either condition.

Counterexample: take a_1 = (1, 0) and a_2 = (1, 1), which form a basis of R^2, and let b = a_1 + a_2 = (2, 1). Then sqrt(x^2 + y^2) = sqrt(2), but magnitude(b) = sqrt(5).

Therefore, the statement is False.

(e) If the columns of matrix A are orthogonal, then the rows of A are independent.
To determine if this statement is true or false, we need to recall the relationship between the columns and rows of a matrix.

The columns of a matrix span its column space, and the rows span its row space.

If the columns of A are orthogonal, each pair of columns has dot product zero. The subtlety is that the zero vector is orthogonal to every vector, so orthogonal columns may include zero columns, and the condition does not guarantee that the rows of A are independent.

For example, A = [[1, 0], [0, 0]] has orthogonal columns (1, 0) and (0, 0), but its second row is the zero row, so the rows are linearly dependent.

Therefore, the statement is False.
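The counterexample above can be verified directly (a minimal sketch with the 2x2 matrix from the example):

import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])        # columns (1,0) and (0,0) are orthogonal

print(A[:, 0] @ A[:, 1])          # 0.0: the columns are orthogonal
print(np.linalg.matrix_rank(A))   # 1: the two rows are not independent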

(f) Row operations do not change the determinant of a matrix.
To determine if this statement is true or false, let's consider the properties of row operations and the determinant of a matrix.

Row operations come in three kinds: interchanging two rows, multiplying a row by a nonzero scalar, and adding a multiple of one row to another row.

Only the third kind leaves the determinant unchanged. Interchanging two rows multiplies the determinant by −1, and multiplying a row by a scalar c multiplies the determinant by c. So row operations can, in general, change the determinant.

Therefore, the statement is False.
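Each of the three row operations can be checked numerically (the 2x2 matrix is an arbitrary example):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])           # det(A) = -2

swapped = A[[1, 0], :]               # interchange the two rows
scaled = A.copy()
scaled[0] *= 5                       # multiply row 0 by 5
replaced = A.copy()
replaced[1] += 2 * replaced[0]       # add 2*(row 0) to row 1

print(np.linalg.det(A))              # -2.0
print(np.linalg.det(swapped))        #  2.0  (sign flipped)
print(np.linalg.det(scaled))         # -10.0 (scaled by 5)
print(np.linalg.det(replaced))       # -2.0  (unchanged)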

(g) If det(2A) = det(3A), then A is not invertible.
To determine if this statement is true or false, we need to understand the relationship between the determinant of a matrix and its invertibility.

A matrix A is invertible exactly when det(A) ≠ 0. The key fact here is the scaling rule for determinants: for an n×n matrix, det(cA) = c^n det(A), because multiplying A by c multiplies each of its n rows by c.

So det(2A) = det(3A) means 2^n det(A) = 3^n det(A), i.e. (2^n − 3^n) det(A) = 0. Since 2^n ≠ 3^n for every n ≥ 1, this forces det(A) = 0, and a matrix with zero determinant is not invertible.

Therefore, the statement is True.

(h) If det(A) = det(A^−1), then A = I.
To determine if this statement is true or false, we need to consider the relationship between the determinant of a matrix and its inverse.

If a matrix A has an inverse A^−1, then the determinant of A^−1 is equal to 1/det(A). In other words, det(A^−1) = 1/det(A).

Given that det(A) = det(A^−1), we have det(A) = 1/det(A), which implies that det(A)^2 = 1. The only way for this equation to hold is if det(A) = ±1.

However, det(A) = ±1 does not force A to be the identity: many matrices besides I have determinant ±1. For instance, A = −I in R^2 satisfies A^−1 = −I and det(A) = det(A^−1) = 1, yet A ≠ I.

Therefore, the statement is False.

(i) There are no matrices A such that det(A^−1) = 0.
To determine if this statement is true or false, we need to consider the properties of the determinant and the inverse of a matrix.

If a matrix A is invertible, then det(A) ≠ 0 and det(A^−1) = 1/det(A). So det(A^−1) = 0 would require 1/det(A) = 0.

But 1/det(A) is never zero, so no invertible matrix A can have det(A^−1) = 0. And if A is not invertible, then A^−1 does not exist in the first place, so det(A^−1) is not even defined.

Therefore, the statement is True.

I hope this helps with your True/False questions! Let me know if you have any further questions.