Orthogonal matrix
Dependencies:
- Product of stacked matrices
- Transpose of stacked matrix
- AB = I implies BA = I
- Orthogonality and orthonormality
- Matrices form an inner-product space
Let $A$ be an $n$ by $n$ matrix over a subfield of $\mathbb{C}$. Let $A^* = \overline{A^T}$ be the conjugate transpose of $A$. Then the following statements are equivalent:
- $AA^* = I$.
- $A^*A = I$.
- Rows of $A$ are orthonormal.
- Columns of $A$ are orthonormal.
If the above conditions are satisfied, $A$ is said to be orthogonal.
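As a numerical sanity check (not part of the source statement), the following sketch verifies all four equivalent conditions for a real rotation matrix, a standard example of an orthogonal matrix:

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal for any angle theta.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

A_star = A.conj().T  # conjugate transpose A*

# AA* = I and A*A = I:
assert np.allclose(A @ A_star, np.eye(2))
assert np.allclose(A_star @ A, np.eye(2))

# Columns v_i of A are orthonormal: <v_i, v_j> is 1 if i = j, else 0.
for i in range(2):
    for j in range(2):
        ip = np.vdot(A[:, j], A[:, i])  # np.vdot conjugates its first argument
        assert np.isclose(ip, 1.0 if i == j else 0.0)

# Rows of A are orthonormal as well:
assert np.allclose(A @ A.conj().T, np.eye(2))
```

The same check works over $\mathbb{C}$, e.g. for a matrix with complex entries, since `A.conj().T` computes $A^*$.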
Proof
Since a left inverse of a square matrix is also a right inverse (AB = I implies BA = I), we get $A^*A = I \iff AA^* = I$. This also means that $A$ is orthogonal iff $A^*$ is orthogonal.
Let $v_1, v_2, \ldots, v_n$ be the columns of $A$.
\[ A^*A = \begin{bmatrix} v_1^* \\ v_2^* \\ \vdots \\ v_n^* \end{bmatrix} \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} v_1^*v_1 & v_1^*v_2 & \cdots & v_1^*v_n \\ v_2^*v_1 & v_2^*v_2 & \cdots & v_2^*v_n \\ \vdots & \vdots & \ddots & \vdots \\ v_n^*v_1 & v_n^*v_2 & \cdots & v_n^*v_n \end{bmatrix} = \begin{bmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_n, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_n, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_n \rangle & \langle v_2, v_n \rangle & \cdots & \langle v_n, v_n \rangle \end{bmatrix} \]
\[ A^*A = I \iff \langle v_i, v_j \rangle = \begin{cases} 0 & i \neq j \\ 1 & i = j \end{cases} \iff \textrm{columns of } A \textrm{ are orthonormal} \]
\[ \textrm{rows of } A \textrm{ are orthonormal} \iff \textrm{columns of } A^* \textrm{ are orthonormal} \iff (A^*)^*A^* = AA^* = I \]
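The Gram-matrix identity used in the proof, $(A^*A)_{ij} = v_i^*v_j$, holds for any complex square matrix, and orthonormalizing the columns makes $A^*A = I$. A sketch in NumPy (the QR factorization is used here only as a convenient way to produce a matrix with orthonormal columns):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Any complex matrix works for the Gram-matrix identity itself.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Build A*A entrywise: (A*A)_{ij} = v_i* v_j, with v_i the columns of A.
G = np.array([[np.vdot(A[:, i], A[:, j]) for j in range(n)]
              for i in range(n)])
assert np.allclose(A.conj().T @ A, G)

# A matrix Q with orthonormal columns satisfies Q*Q = I ...
Q, _ = np.linalg.qr(A)
assert np.allclose(Q.conj().T @ Q, np.eye(n))
# ... and hence QQ* = I as well (a left inverse is also a right inverse).
assert np.allclose(Q @ Q.conj().T, np.eye(n))
```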
Dependency for:
- Matrix of orthonormal basis change
- Orthogonally diagonalizable iff hermitian
- Orthonormal basis change matrix
Info:
- Depth: 11
- Number of transitive dependencies: 50
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /complex-numbers/conjugate-product-abs
- /complex-numbers/conjugation-is-homomorphic
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Vector
- Dot-product of vectors
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- Field
- Vector Space
- Linear independence
- Span
- Inner product space
- Orthogonality and orthonormality
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Transpose of stacked matrix
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Transpose of product
- Trace of a matrix
- Matrices over a field form a vector space
- Row space
- Matrices form an inner-product space
- Elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Rank of a homogeneous system of linear equations
- Rank of a matrix
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Full-rank square matrix in RREF is the identity matrix
- Full-rank square matrix is invertible
- AB = I implies BA = I