Orthogonally diagonalizable iff hermitian
Dependencies:
- Conjugate Transpose and Hermitian
- Orthogonal matrix
- Transpose of product
- Symmetric operator iff hermitian
- Symmetric operator on V has a basis of orthonormal eigenvectors
- All eigenvalues of a symmetric operator are real
- Real matrix with real eigenvalues has real eigenvectors
- Diagonalization
Let $A$ be an $n$ by $n$ matrix over $\mathbb{C}$.
$A$ is said to be orthogonally diagonalizable iff $A = PDP^*$, where $P$ is an orthogonal matrix (so $P^*P = I$) and $D$ is a diagonal matrix with real entries.
We'll prove 2 things:
- $A$ is orthogonally diagonalizable iff $A = A^*$.
- Let $A = A^*$. Let $[v_1, v_2, \ldots, v_n]$ be orthonormal eigenvectors of $A$ with $[\lambda_1, \lambda_2, \ldots, \lambda_n]$ as the corresponding eigenvalues. Let $P$ be the matrix whose $i^{\textrm{th}}$ column is $v_i$, and let $D$ be the diagonal matrix with $D[i, i] = \lambda_i$. Then $P$ is orthogonal and $A = PDP^*$. Also, if $A$ is real, the $v_i$ can be chosen real, which makes $P$ real.
Proof of 'only-if' part
Let $A$ be orthogonally diagonalizable.
\[ A = PDP^* \implies A^* = (PDP^*)^* = PD^*P^* = PDP^* = A \]
Here $D^* = D$ because $D$ is diagonal with real entries, and the middle step uses the fact that the conjugate transpose reverses products.
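This direction can be checked numerically. The following sketch (using NumPy; the construction via QR is an illustrative choice, not part of the note) builds $A = PDP^*$ from a random unitary $P$ and a real diagonal $D$, then confirms $A = A^*$:

```python
import numpy as np

rng = np.random.default_rng(0)

# An "orthogonal" P in the note's sense (P* P = I), obtained from the
# QR decomposition of a random complex matrix.
P, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

# A diagonal D with real entries.
D = np.diag(rng.normal(size=4))

# A = P D P* should be hermitian: A* = P D* P* = P D P* = A.
A = P @ D @ P.conj().T
assert np.allclose(A, A.conj().T)
```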
Proof of 'if' part
Let $A = A^*$. This means that the operator $T(u) = Au$ is a symmetric operator over the vector space $\mathbb{C}^n$.
A symmetric operator on a finite-dimensional vector space $V$ over field $\mathbb{C}$ has $\dim(V)$ orthonormal eigenvectors. Therefore, $A$ has $n$ orthonormal eigenvectors. Let $[v_1, v_2, \ldots, v_n]$ be the eigenvectors and $[\lambda_1, \lambda_2, \ldots, \lambda_n]$ be the corresponding eigenvalues. $\forall i, \lambda_i \in \mathbb{R}$, since all eigenvalues of a symmetric operator are real. If $A$ is real, $[v_1, v_2, \ldots, v_n]$ can be chosen real, since a real matrix with real eigenvalues has real eigenvectors.
Let $P$ be the matrix whose columns are $[v_1, v_2, \ldots, v_n]$. Since its columns are orthonormal, $P$ is orthogonal, and $P$ is real if $A$ is real. Let $D$ be the diagonal matrix with $D[i, i] = \lambda_i$. Then $AP = PD$, since the $i^{\textrm{th}}$ column of $AP$ is $Av_i = \lambda_i v_i$. Because $PP^* = I$, we get $A = A(PP^*) = (AP)P^* = PDP^*$. Therefore, $A$ is orthogonally diagonalizable.
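The construction above is what `numpy.linalg.eigh` computes for a hermitian matrix: real eigenvalues and orthonormal eigenvectors as the columns of $P$. A minimal sketch (the concrete matrices are illustrative assumptions):

```python
import numpy as np

# A hermitian matrix A (A = A*).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

# eigh returns real eigenvalues and orthonormal eigenvectors (columns of P).
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

assert np.allclose(P.conj().T @ P, np.eye(2))  # P is orthogonal: P* P = I
assert np.allclose(A, P @ D @ P.conj().T)      # A = P D P*

# For a real symmetric A, the returned P is real.
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, Q = np.linalg.eigh(B)
assert np.isrealobj(Q)
```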
Dependency for:
Info:
- Depth: 16
- Number of transitive dependencies: 92
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /complex-numbers/conjugate-product-abs
- /complex-numbers/conjugation-is-homomorphic
- /complex-numbers/complex-numbers
- /linear-algebra/eigenvectors/cayley-hamilton-theorem
- /misc/fundamental-theorem-of-algebra
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Vector
- Dot-product of vectors
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- 0x = 0 = x0
- Field
- Vector Space
- Linear independence
- Span
- Incrementing a linearly independent set
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Inner product space
- Inner product is anti-linear in second argument
- Orthogonality and orthonormality
- Gram-Schmidt Process
- A set of mutually orthogonal vectors is linearly independent
- Symmetric operator
- A field is an integral domain
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Transpose of stacked matrix
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Conjugate Transpose and Hermitian
- Transpose of product
- Conjugation of matrices is homomorphic
- Submatrix
- Determinant
- Determinant of upper triangular matrix
- Swapping last 2 rows of a matrix negates its determinant
- Trace of a matrix
- Matrices over a field form a vector space
- Row space
- Matrices form an inner-product space
- Elementary row operation
- Determinant after elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Rank of a homogenous system of linear equations
- Rank of a matrix
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- A set of dim(V) linearly independent vectors is a basis
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors
- All eigenvalues of a symmetric operator are real
- Real matrix with real eigenvalues has real eigenvectors
- Diagonalization
- Symmetric operator iff hermitian
- Linearly independent set can be expanded into a basis
- Full-rank square matrix in RREF is the identity matrix
- A matrix is full-rank iff its determinant is non-0
- Characteristic polynomial of a matrix
- Degree and monicness of a characteristic polynomial
- Full-rank square matrix is invertible
- AB = I implies BA = I
- Determinant of product is product of determinants
- Every complex matrix has an eigenvalue
- Symmetric operator on V has a basis of orthonormal eigenvectors
- Orthogonal matrix