Symmetric operator on V has a basis of orthonormal eigenvectors

Dependencies:

  1. Symmetric operator
  2. Basis of a vector space
  3. Eigenvalues and Eigenvectors
  4. Orthogonality and orthonormality
  5. Matrix of linear transformation
  6. Every complex matrix has an eigenvalue
  7. All eigenvalues of a symmetric operator are real
  8. Real matrix with real eigenvalues has real eigenvectors
  9. A set of dim(V) linearly independent vectors is a basis
  10. Linearly independent set can be expanded into a basis
  11. Gram-Schmidt Process
  12. Incrementing a linearly independent set
  13. Inner product is anti-linear in second argument
  14. A set of mutually orthogonal vectors is linearly independent

Let $V$ be a finite-dimensional inner product space over the field $F$, where $F$ is either $\mathbb{R}$ or $\mathbb{C}$. Let $L: V \to V$ be a symmetric operator.

Then there is a basis of $V$ consisting of orthonormal eigenvectors of $L$ with real eigenvalues.
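As a sanity check (not part of the proof), the theorem can be observed numerically. A minimal sketch, assuming NumPy: `numpy.linalg.eigh`, which is designed for symmetric/Hermitian matrices, returns real eigenvalues and orthonormal eigenvector columns, exactly as the theorem predicts.

```python
import numpy as np

# Numerical illustration of the theorem for a random real symmetric matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                      # A is symmetric by construction

eigvals, eigvecs = np.linalg.eigh(A)   # eigh handles symmetric/Hermitian matrices
assert np.all(np.isreal(eigvals))                            # eigenvalues are real
assert np.allclose(eigvecs.T @ eigvecs, np.eye(4))           # columns are orthonormal
assert np.allclose(A @ eigvecs, eigvecs @ np.diag(eigvals))  # columns are eigenvectors
```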

Proof

We prove this by induction on $\dim(V)$.

Let $P(n)$ be the predicate: for every $n$-dimensional inner product space $V$ and every symmetric operator $L$ on $V$, $V$ has an orthonormal basis of $n$ eigenvectors of $L$, and the corresponding eigenvalues are real.

Base case

A vector space of dimension 0 is the trivial vector space $\{0\}$, whose basis is the empty set. Therefore, $P(0)$ holds vacuously.

Inductive step

Let $\dim(V) = n \ge 1$ and assume $P(n-1)$ holds.

Since $\dim(V) = n \ge 1$, the operator $L$ has a matrix associated with it relative to any basis of $V$. Let the matrix of $L$ be $A$. The eigenvalue-eigenvector pairs of $A$ correspond to eigenvalue-eigenvector pairs of $L$.

Since every complex matrix has an eigenvalue, $A$ (viewed as a complex matrix if necessary) has an eigenvalue-eigenvector pair $(\lambda, x)$. Since all eigenvalues of a symmetric operator are real, $\lambda \in \mathbb{R}$.

If $F = \mathbb{R}$, then $A$ is a real matrix. Since $\lambda$ is real and a real matrix with real eigenvalues has real eigenvectors, $x$ can be chosen to be real. In either case, $x \in F^n$, so $L$ has a corresponding eigenvector $u \in V$, giving an eigenvalue-eigenvector pair $(\lambda, u)$ of $L$.

Since $u \neq 0$ and $\left(\lambda, \frac{u}{\|u\|}\right)$ is also an eigenvalue-eigenvector pair, we can assume without loss of generality that $\|u\| = 1$.

Orthonormal basis of $V$

Since $u \neq 0$, $\{u\}$ is linearly independent. This means $\{u\}$ can be expanded into a basis of $V$. Let $S = [u, u_1, u_2, \ldots, u_{n-1}]$ be a basis of $V$. Therefore, $|S| = \dim(V)$.

By applying the Gram-Schmidt process to $S$, we can find an orthonormal basis $B$ of $V$ whose first vector is a positive multiple of the first vector of $S$; since $\|u\| = 1$, the first vector of $B$ is $u$ itself. Therefore, without loss of generality, we can assume that $S$ is an orthonormal basis of $V$.
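To illustrate this step, here is a minimal sketch (assuming NumPy; the name `extend_to_orthonormal_basis` is illustrative, not from the source) that extends a unit vector $u$ to an orthonormal basis of $\mathbb{R}^n$: append the standard basis vectors and run Gram-Schmidt, discarding vectors that become linearly dependent. Because $\|u\| = 1$, the first output vector is $u$ itself.

```python
import numpy as np

def extend_to_orthonormal_basis(u, tol=1e-12):
    """Extend a unit vector u to an orthonormal basis of R^n via Gram-Schmidt."""
    n = len(u)
    basis = [u / np.linalg.norm(u)]                   # first vector stays u (already unit)
    for e in np.eye(n):                               # candidates: standard basis vectors
        v = e - sum(np.dot(e, b) * b for b in basis)  # remove components along basis
        if np.linalg.norm(v) > tol:                   # keep only if linearly independent
            basis.append(v / np.linalg.norm(v))
    return np.array(basis)                            # rows form an orthonormal basis

B = extend_to_orthonormal_basis(np.array([0.6, 0.8, 0.0]))
assert np.allclose(B @ B.T, np.eye(3))                # rows are orthonormal
assert np.allclose(B[0], [0.6, 0.8, 0.0])             # first vector is u itself
```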

Dimension of the orthogonal complement of $u$

Let $W = \{v \in V: \langle v, u \rangle = 0 \}$, the orthogonal complement of $u$ in $V$; it is a subspace of $V$.

Since $S$ is linearly independent, $S - \{u\}$ is also linearly independent. Since $S$ is orthogonal, all vectors in $S - \{u\}$ are orthogonal to $u$. Therefore, $S - \{u\} \subseteq W$.

Since $S - \{u\}$ is a linearly independent subset of $W$, it can be expanded to a basis of $W$. Therefore, $|S - \{u\}| \le \dim(W) \Rightarrow \dim(V)-1 \le \dim(W)$.

Let $B$ be a basis of $W$. That means $|B| = \dim(W)$. $\langle u, u \rangle = 1 \Rightarrow u \not\in W = \operatorname{span}(B)$. Since $B$ is linearly independent and $u$ is not a linear combination of $B$, $B \cup \{u\}$ is linearly independent. Since $B \cup \{u\}$ is a linearly independent subset of $V$, it can be expanded into a basis for $V$. Therefore, $|B \cup \{u\}| \le \dim(V) \Rightarrow \dim(W) \le \dim(V)-1$.

$\dim(V)-1 \le \dim(W)$ and $\dim(W) \le \dim(V) - 1$ implies $\dim(W) = \dim(V) - 1$.
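For real $V$, $W$ is exactly the null space of the $1 \times n$ matrix $u^{\top}$, which makes this dimension count easy to verify numerically. A minimal sketch, assuming NumPy and SciPy's `scipy.linalg.null_space`:

```python
import numpy as np
from scipy.linalg import null_space

u = np.array([0.6, 0.8, 0.0])              # a unit vector in R^3
W_basis = null_space(u.reshape(1, -1))     # columns: orthonormal basis of W
assert W_basis.shape[1] == len(u) - 1      # dim(W) = dim(V) - 1
assert np.allclose(u @ W_basis, 0)         # every basis vector is orthogonal to u
```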

Orthonormal basis of eigenvectors for $W$

Let $w \in W$. By definition of $W$, \[ w \in W \iff \langle w, u \rangle = 0. \] Then:

\begin{align}
\langle L(w), u \rangle
&= \langle w, L(u) \rangle \tag{$L$ is symmetric} \\
&= \langle w, \lambda u \rangle \tag{$u$ is an eigenvector of $L$} \\
&= \overline{\lambda} \langle w, u \rangle \tag{inner product is anti-linear in the second argument} \\
&= \overline{\lambda} \cdot 0 = 0.
\end{align}

Therefore, $L(w) \in W$ for every $w \in W$, i.e., $W$ is invariant under $L$. Hence the restriction of $L$ to $W$ is a symmetric operator on $W$.
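This invariance can also be checked numerically. A minimal sketch (assuming NumPy): take a symmetric $A$, a unit eigenvector $u$, and a random $w$ projected into $W$; then $Aw$ remains orthogonal to $u$.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                      # a symmetric operator, as a matrix
u = np.linalg.eigh(A)[1][:, 0]         # a unit eigenvector of A

w = rng.standard_normal(4)
w -= np.dot(w, u) * u                  # project w into W = {v : <v, u> = 0}
assert abs(np.dot(A @ w, u)) < 1e-10   # A w is still orthogonal to u
```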

Since $\dim(W) = n-1$ and the restriction $L: W \to W$ is a symmetric operator, $P(n-1)$ implies that $W$ has an orthonormal basis of eigenvectors of $L$ in which all eigenvalues are real. Let $B = [v_1, v_2, \ldots, v_{n-1}]$ be such a basis of $W$.

Conclusion

Since each $v_i \in W$, we have $\langle v_i, u \rangle = 0$. Therefore, $B \cup \{u\}$ is an orthonormal set of eigenvectors of $L$.

Since a set of mutually orthogonal vectors is linearly independent, $B \cup \{u\}$ is linearly independent. Since a linearly independent set of size $\dim(V)$ is a basis of $V$, $B \cup \{u\}$ is an orthonormal basis of $V$ consisting of eigenvectors of $L$ with real eigenvalues. This proves $P(n)$ and completes the induction.
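Since the induction is constructive, the proof can be read as a recursive algorithm. The sketch below (assuming NumPy; `symmetric_eigenbasis` is our illustrative name) mirrors the proof for real symmetric matrices: `np.linalg.eig` stands in for the existence argument ("every complex matrix has an eigenvalue"), a complete QR factorization plays the role of extending $\{u\}$ to an orthonormal basis and hence of building a basis of $W$, and the recursion is the inductive step.

```python
import numpy as np

def symmetric_eigenbasis(A):
    """Orthonormal eigenbasis of a real symmetric matrix A, built by the
    same induction as the proof. Returns (eigenvalues, matrix whose
    columns are the orthonormal eigenvectors)."""
    n = A.shape[0]
    if n == 0:                                    # base case P(0): empty basis
        return np.zeros(0), np.zeros((0, 0))
    w, v = np.linalg.eig(A)                       # some eigenpair exists
    lam, u = np.real(w[0]), np.real(v[:, 0])      # real, since A is symmetric
    u /= np.linalg.norm(u)                        # WLOG ||u|| = 1
    Q = np.linalg.qr(u.reshape(-1, 1), mode='complete')[0]
    W = Q[:, 1:]                                  # columns: orthonormal basis of W
    sub_vals, sub_vecs = symmetric_eigenbasis(W.T @ A @ W)  # restriction of L to W
    vals = np.concatenate(([lam], sub_vals))
    vecs = np.column_stack([u, W @ sub_vecs])     # lift eigenvectors of W back to V
    return vals, vecs

# Usage: the recovered basis is orthonormal and consists of eigenvectors.
M = np.random.default_rng(2).standard_normal((4, 4))
A = (M + M.T) / 2
vals, vecs = symmetric_eigenbasis(A)
assert np.allclose(vecs.T @ vecs, np.eye(4))      # orthonormal basis of V
assert np.allclose(A @ vecs, vecs * vals)         # each column is an eigenvector
```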

Dependency for:

  1. Orthogonally diagonalizable iff hermitian

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /complex-numbers/conjugation-is-homomorphic
  4. /complex-numbers/complex-numbers
  5. /linear-algebra/eigenvectors/cayley-hamilton-theorem
  6. /misc/fundamental-theorem-of-algebra
  7. /sets-and-relations/composition-of-bijections-is-a-bijection
  8. /sets-and-relations/equivalence-relation
  9. Group
  10. Ring
  11. Polynomial
  12. Integral Domain
  13. Comparing coefficients of a polynomial with disjoint variables
  14. 0x = 0 = x0
  15. Field
  16. Vector Space
  17. Linear independence
  18. Span
  19. Incrementing a linearly independent set
  20. Linear transformation
  21. Composition of linear transformations
  22. Vector space isomorphism is an equivalence relation
  23. Inner product space
  24. Inner product is anti-linear in second argument
  25. Orthogonality and orthonormality
  26. Gram-Schmidt Process
  27. A set of mutually orthogonal vectors is linearly independent
  28. Symmetric operator
  29. A field is an integral domain
  30. Semiring
  31. Matrix
  32. Stacking
  33. System of linear equations
  34. Product of stacked matrices
  35. Matrix multiplication is associative
  36. Reduced Row Echelon Form (RREF)
  37. Conjugation of matrices is homomorphic
  38. Submatrix
  39. Determinant
  40. Determinant of upper triangular matrix
  41. Swapping last 2 rows of a matrix negates its determinant
  42. Matrices over a field form a vector space
  43. Row space
  44. Elementary row operation
  45. Determinant after elementary row operation
  46. Every elementary row operation has a unique inverse
  47. Row equivalence of matrices
  48. Row equivalent matrices have the same row space
  49. RREF is unique
  50. Identity matrix
  51. Inverse of a matrix
  52. Inverse of product
  53. Elementary row operation is matrix pre-multiplication
  54. Row equivalence matrix
  55. Equations with row equivalent matrices have the same solution set
  56. Basis of a vector space
  57. Linearly independent set is not bigger than a span
  58. Homogeneous linear equations with more variables than equations
  59. Rank of a homogenous system of linear equations
  60. Rank of a matrix
  61. A set of dim(V) linearly independent vectors is a basis
  62. Basis of F^n
  63. Matrix of linear transformation
  64. Coordinatization over a basis
  65. Basis changer
  66. Basis change is an isomorphic linear transformation
  67. Vector spaces are isomorphic iff their dimensions are same
  68. Canonical decomposition of a linear transformation
  69. Eigenvalues and Eigenvectors
  70. All eigenvalues of a symmetric operator are real
  71. Real matrix with real eigenvalues has real eigenvectors
  72. Linearly independent set can be expanded into a basis
  73. Full-rank square matrix in RREF is the identity matrix
  74. A matrix is full-rank iff its determinant is non-0
  75. Characteristic polynomial of a matrix
  76. Degree and monicness of a characteristic polynomial
  77. Full-rank square matrix is invertible
  78. AB = I implies BA = I
  79. Determinant of product is product of determinants
  80. Every complex matrix has an eigenvalue