Orthogonal matrix

Dependencies:

  1. Product of stacked matrices
  2. Transpose of stacked matrix
  3. AB = I implies BA = I
  4. Orthogonality and orthonormality
  5. Matrices form an inner-product space

Let $A$ be an $n$ by $n$ matrix over a subfield of $\mathbb{C}$. Let $A^* = \overline{A^T}$ be the conjugate transpose of $A$. Then the following statements are equivalent:

  1. $A^*A = I$.
  2. $AA^* = I$.
  3. The columns of $A$ are orthonormal.
  4. The rows of $A$ are orthonormal.

If the above conditions are satisfied, $A$ is said to be orthogonal.
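For example, for any $\theta \in \mathbb{R}$, the rotation matrix

\[ R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \]

is orthogonal: both columns have unit norm, since $\cos^2\theta + \sin^2\theta = 1$, and they are orthogonal to each other, since $(\cos\theta)(-\sin\theta) + (\sin\theta)(\cos\theta) = 0$.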

Proof

Since a left inverse of a square matrix is also a right inverse (by "AB = I implies BA = I"), $A^*A = I \iff AA^* = I$. This also means that $A$ is orthogonal iff $A^*$ is orthogonal.
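Explicitly, using $(A^*)^* = A$:

\[ A^* \textrm{ is orthogonal} \iff (A^*)^*A^* = I \iff AA^* = I \iff A^*A = I \iff A \textrm{ is orthogonal} \]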

Let $v_1, v_2, \ldots, v_n$ be the columns of $A$.

\[ A^*A = \begin{bmatrix} v_1^* \\ v_2^* \\ \vdots \\ v_n^* \end{bmatrix} \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} v_1^*v_1 & v_1^*v_2 & \cdots & v_1^*v_n \\ v_2^*v_1 & v_2^*v_2 & \cdots & v_2^*v_n \\ \vdots & \vdots & \ddots & \vdots \\ v_n^*v_1 & v_n^*v_2 & \cdots & v_n^*v_n \end{bmatrix} = \begin{bmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_n, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_n, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_n \rangle & \langle v_2, v_n \rangle & \cdots & \langle v_n, v_n \rangle \end{bmatrix} \]

\[ A^*A = I \iff \langle v_i, v_j \rangle = \begin{cases} 0 & i \neq j \\ 1 & i = j \end{cases} \iff \textrm{columns of } A \textrm{ are orthonormal} \]

Applying the same argument to $A^*$: the columns of $A^*$ are the conjugates of the rows of $A$, and conjugation preserves orthonormality (since $\langle \overline{u}, \overline{w} \rangle = \overline{\langle u, w \rangle}$), so

\[ \textrm{rows of } A \textrm{ are orthonormal} \iff \textrm{columns of } A^* \textrm{ are orthonormal} \iff (A^*)^*A^* = AA^* = I \]
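As a concrete instance of this computation, take

\[ A = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & i \\ i & 1 \end{bmatrix}, \qquad v_1 = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ i \end{bmatrix}, \quad v_2 = \frac{1}{\sqrt{2}} \begin{bmatrix} i \\ 1 \end{bmatrix} \]

Then $\langle v_1, v_1 \rangle = \frac{1}{2}(\overline{1} \cdot 1 + \overline{i} \cdot i) = 1 = \langle v_2, v_2 \rangle$ and $\langle v_1, v_2 \rangle = \frac{1}{2}(\overline{i} \cdot 1 + \overline{1} \cdot i) = \frac{1}{2}(-i + i) = 0$, so $A^*A = I$, and hence $AA^* = I$ and the rows of $A$ are orthonormal as well.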

Dependency for:

  1. Matrix of orthonormal basis change
  2. Orthogonally diagonalizable iff hermitian
  3. Orthonormal basis change matrix

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /complex-numbers/conjugate-product-abs
  4. /complex-numbers/conjugation-is-homomorphic
  5. /sets-and-relations/equivalence-relation
  6. Group
  7. Ring
  8. Polynomial
  9. Vector
  10. Dot-product of vectors
  11. Integral Domain
  12. Comparing coefficients of a polynomial with disjoint variables
  13. Field
  14. Vector Space
  15. Linear independence
  16. Span
  17. Inner product space
  18. Orthogonality and orthonormality
  19. Semiring
  20. Matrix
  21. Stacking
  22. System of linear equations
  23. Product of stacked matrices
  24. Transpose of stacked matrix
  25. Matrix multiplication is associative
  26. Reduced Row Echelon Form (RREF)
  27. Transpose of product
  28. Trace of a matrix
  29. Matrices over a field form a vector space
  30. Row space
  31. Matrices form an inner-product space
  32. Elementary row operation
  33. Every elementary row operation has a unique inverse
  34. Row equivalence of matrices
  35. Row equivalent matrices have the same row space
  36. RREF is unique
  37. Identity matrix
  38. Inverse of a matrix
  39. Inverse of product
  40. Elementary row operation is matrix pre-multiplication
  41. Row equivalence matrix
  42. Equations with row equivalent matrices have the same solution set
  43. Rank of a homogeneous system of linear equations
  44. Rank of a matrix
  45. Basis of a vector space
  46. Linearly independent set is not bigger than a span
  47. Homogeneous linear equations with more variables than equations
  48. Full-rank square matrix in RREF is the identity matrix
  49. Full-rank square matrix is invertible
  50. AB = I implies BA = I