Symmetric operator on V has a basis of orthonormal eigenvectors
Dependencies:
- Symmetric operator
- Basis of a vector space
- Eigenvalues and Eigenvectors
- Orthogonality and orthonormality
- Matrix of linear transformation
- Every complex matrix has an eigenvalue
- All eigenvalues of a symmetric operator are real
- Real matrix with real eigenvalues has real eigenvectors
- A set of dim(V) linearly independent vectors is a basis
- Linearly independent set can be expanded into a basis
- Gram-Schmidt Process
- Incrementing a linearly independent set
- Inner product is anti-linear in second argument
- A set of mutually orthogonal vectors is linearly independent
Let $V$ be a finite-dimensional inner product space over the field $F$, where $F$ is either $\mathbb{R}$ or $\mathbb{C}$. Let $L: V \to V$ be a symmetric operator.
Then there is a basis of $V$ consisting of orthonormal eigenvectors of $L$, all with real eigenvalues.
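As a quick numerical illustration of the statement (a minimal sketch, assuming NumPy is available; the matrix `A` below is an arbitrary example, and `np.linalg.eigh` is NumPy's eigensolver for symmetric/Hermitian matrices):

```python
import numpy as np

# Build an arbitrary real symmetric matrix, i.e. a symmetric operator on R^4.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

eigvals, eigvecs = np.linalg.eigh(A)

assert np.all(np.isreal(eigvals))                   # all eigenvalues are real
assert np.allclose(eigvecs.T @ eigvecs, np.eye(4))  # eigenvectors are orthonormal
assert np.allclose(A @ eigvecs, eigvecs * eigvals)  # columns are eigenvectors of A
```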
Proof
We proceed by induction on $\dim(V)$.
Let $P(n)$ be the predicate that for every inner product space of dimension $n$ and every symmetric operator $L$ on it, there is an orthonormal basis consisting of $n$ eigenvectors of $L$, and the corresponding eigenvalues are real.
Base case
A vector space of dimension 0 is the vector space $\{0\}$. Its basis is the empty set $\{\}$, which is vacuously an orthonormal set of eigenvectors. Therefore, $P(0)$ holds.
Inductive step
Let $\dim(V) = n \ge 1$ and assume $P(n-1)$ holds.
Since $\dim(V) = n \ge 1$, $L$ has a matrix with respect to any basis of $V$; let $A$ be the matrix of $L$. The eigenvalue-eigenvector pairs of $A$ can be mapped to eigenvalue-eigenvector pairs of $L$.
Since every complex matrix has an eigenvalue, $A$ has an eigenvalue-eigenvector pair $(\lambda, x)$. Since all eigenvalues of a symmetric operator are real, $\lambda \in \mathbb{R}$.
If $F = \mathbb{R}$, $A$ is a real matrix, and since $\lambda$ is real, the eigenvector $x$ can be chosen to be real. In either case, $x \in F^n$, so $L$ has a corresponding eigenvector $u \in V$ with eigenvalue $\lambda$.
Since $u \neq 0$ and $\left(\lambda, \frac{u}{\|u\|}\right)$ is also an eigenvalue-eigenvector pair, we can assume without loss of generality that $\|u\| = 1$.
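This step can be mirrored numerically (a sketch assuming NumPy; `unit_eigenpair` is a hypothetical helper name, and the `.real` casts mirror the fact that a real matrix with a real eigenvalue has a real eigenvector):

```python
import numpy as np

def unit_eigenpair(A):
    """Return one (real eigenvalue, unit eigenvector) pair of a real symmetric A."""
    eigvals, eigvecs = np.linalg.eig(A)  # some eigenpair exists over C
    lam = eigvals[0].real                # eigenvalues of a symmetric operator are real
    x = eigvecs[:, 0].real               # a real eigenvector can be chosen for real lam
    return lam, x / np.linalg.norm(x)    # (lam, x/||x||) is still an eigenpair
```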
Orthonormal basis of $V$
Since $u \neq 0$, $\{u\}$ is linearly independent, so it can be expanded into a basis of $V$. Let $S = [u, u_1, u_2, \ldots, u_{n-1}]$ be such a basis; then $|S| = \dim(V)$.
By applying the Gram-Schmidt process to $S$, we obtain an orthonormal basis of $V$ whose first vector is the same as the first vector of $S$ (since $\|u\| = 1$, normalization leaves $u$ unchanged). Therefore, without loss of generality, we can assume that $S$ is an orthonormal basis of $V$.
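A sketch of this orthonormalization for the real case (assuming NumPy; `gram_schmidt` is an illustrative helper, not a library function):

```python
import numpy as np

def gram_schmidt(S):
    """Orthonormalize the columns of S (assumed linearly independent).
    If S[:, 0] is already a unit vector, it is returned unchanged."""
    B = []
    for v in S.T:  # iterate over the columns of S
        w = v - sum(np.dot(v, b) * b for b in B)  # remove projections onto earlier vectors
        B.append(w / np.linalg.norm(w))           # normalize
    return np.column_stack(B)
```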
Dimension of the orthogonal complement of $u$
Let $W = \{v \in V: \langle v, u \rangle = 0 \}$.
Since $S$ is linearly independent, $S - \{u\}$ is also linearly independent. Since $S$ is orthogonal, all vectors in $S - \{u\}$ are orthogonal to $u$. Therefore, $S - \{u\} \subseteq W$.
Since $S - \{u\}$ is a linearly independent subset of $W$, it can be expanded into a basis of $W$. Therefore, $|S - \{u\}| \le \dim(W)$, i.e., $\dim(V) - 1 \le \dim(W)$.
Let $C$ be a basis of $W$, so $|C| = \dim(W)$. Since $\langle u, u \rangle = 1$, we have $u \not\in W = \operatorname{span}(C)$. Since $C$ is linearly independent and $u$ is not a linear combination of $C$, $C \cup \{u\}$ is linearly independent. Since $C \cup \{u\}$ is a linearly independent subset of $V$, it can be expanded into a basis of $V$. Therefore, $|C \cup \{u\}| \le \dim(V)$, i.e., $\dim(W) \le \dim(V) - 1$.
$\dim(V) - 1 \le \dim(W)$ and $\dim(W) \le \dim(V) - 1$ together imply $\dim(W) = \dim(V) - 1 = n - 1$.
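A numeric check of this dimension count (a sketch assuming NumPy; the SVD of the row vector $u^T$ yields an orthonormal basis of its null space, which is exactly $W$):

```python
import numpy as np

n = 4
u = np.array([0.5, 0.5, 0.5, 0.5])          # any unit vector u in R^4
_, _, Vt = np.linalg.svd(u.reshape(1, -1))  # W is the null space of the row u^T
W_basis = Vt[1:]                            # remaining right singular vectors span W
assert W_basis.shape[0] == n - 1            # dim(W) = dim(V) - 1
assert np.allclose(W_basis @ u, 0)          # each basis vector of W is orthogonal to u
```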
Orthonormal basis of eigenvectors of $L$ in $W$
\[ w \in W \iff \langle w, u \rangle = 0 \]
For any $w \in W$:
\begin{align} \langle L(w), u \rangle &= \langle w, L(u) \rangle \tag{$L$ is symmetric} \\ &= \langle w, \lambda u \rangle \tag{$u$ is an eigenvector of $L$} \\ &= \overline{\lambda} \langle w, u \rangle \tag{anti-linearity in the second argument} \\ &= \overline{\lambda} \cdot 0 = 0 \end{align}
Therefore, $L(w) \in W$ for every $w \in W$, i.e., $W$ is invariant under $L$. Since $\langle L(v), w \rangle = \langle v, L(w) \rangle$ holds in particular for all $v, w \in W$, the restriction of $L$ to $W$ is a symmetric operator on $W$.
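A quick numeric check of this invariance (a sketch assuming NumPy; `A`, `u`, and `w` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                # a symmetric operator on R^4

_, V = np.linalg.eigh(A)
u = V[:, 0]                      # a unit eigenvector of A

w = rng.standard_normal(4)
w -= np.dot(w, u) * u            # force <w, u> = 0, i.e. w in W

assert np.isclose(np.dot(A @ w, u), 0.0)  # <L(w), u> = 0, so L(w) stays in W
```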
Since $\dim(W) = n-1$ and $L: W \to W$ is a symmetric operator, $P(n-1)$ implies that $W$ has an orthonormal basis of eigenvectors of $L$ where all eigenvalues are real. Let $B = [v_1, v_2, \ldots, v_{n-1}]$ be such a basis of $W$.
Conclusion
For each $i$, $v_i \in W \Rightarrow \langle v_i, u \rangle = 0$. Therefore, $B \cup \{u\}$ is an orthonormal set of eigenvectors of $L$ with real eigenvalues.
Since a set of mutually orthogonal vectors is linearly independent, $B \cup \{u\}$ is linearly independent. Since a linearly independent set of size $\dim(V)$ is a basis of $V$, $B \cup \{u\}$ is a basis of $V$. This proves $P(n)$ and completes the induction.
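The proof is constructive, and its recursion can be sketched directly (assuming NumPy; `symmetric_eigenbasis` is a hypothetical illustration of the argument, not a robust eigensolver):

```python
import numpy as np

def symmetric_eigenbasis(A):
    """Mirror the inductive proof: peel off one unit eigenvector u, restrict A
    to W = u-perp via an orthonormal basis of W, and recurse on dimension n - 1.
    Returns (eigenvalues, matrix whose columns are orthonormal eigenvectors)."""
    n = A.shape[0]
    if n == 0:                              # base case P(0): the empty basis
        return np.zeros(0), np.zeros((0, 0))
    lam = np.linalg.eigvalsh(A)[0]          # some real eigenvalue exists
    _, _, Vt = np.linalg.svd(A - lam * np.eye(n))
    u = Vt[-1]                              # unit vector in the null space of A - lam*I
    Q, _ = np.linalg.qr(np.column_stack([u, np.eye(n)]))  # complete {u}, Gram-Schmidt
    W = Q[:, 1:]                            # orthonormal basis of W = u-perp
    vals, vecs = symmetric_eigenbasis(W.T @ A @ W)        # restriction of L to W
    return np.concatenate(([lam], vals)), np.column_stack([u, W @ vecs])
```

On the matrix `A` from the first snippet, `symmetric_eigenbasis(A)` should reproduce the eigenvalues of `np.linalg.eigh(A)` and eigenvectors that agree up to sign.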
Dependency for:
Info:
- Depth: 15
- Number of transitive dependencies: 80
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /complex-numbers/conjugation-is-homomorphic
- /complex-numbers/complex-numbers
- /linear-algebra/eigenvectors/cayley-hamilton-theorem
- /misc/fundamental-theorem-of-algebra
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- 0x = 0 = x0
- Field
- Vector Space
- Linear independence
- Span
- Incrementing a linearly independent set
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Inner product space
- Inner product is anti-linear in second argument
- Orthogonality and orthonormality
- Gram-Schmidt Process
- A set of mutually orthogonal vectors is linearly independent
- Symmetric operator
- A field is an integral domain
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Conjugation of matrices is homomorphic
- Submatrix
- Determinant
- Determinant of upper triangular matrix
- Swapping last 2 rows of a matrix negates its determinant
- Matrices over a field form a vector space
- Row space
- Elementary row operation
- Determinant after elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogenous system of linear equations
- Rank of a matrix
- A set of dim(V) linearly independent vectors is a basis
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors
- All eigenvalues of a symmetric operator are real
- Real matrix with real eigenvalues has real eigenvectors
- Linearly independent set can be expanded into a basis
- Full-rank square matrix in RREF is the identity matrix
- A matrix is full-rank iff its determinant is non-0
- Characteristic polynomial of a matrix
- Degree and monicness of a characteristic polynomial
- Full-rank square matrix is invertible
- AB = I implies BA = I
- Determinant of product is product of determinants
- Every complex matrix has an eigenvalue