Every complex matrix has an eigenvalue

Dependencies:

  1. Matrix
  2. Eigenvalues and Eigenvectors
  3. Degree and monicness of a characteristic polynomial
  4. /misc/fundamental-theorem-of-algebra
  5. /linear-algebra/eigenvectors/cayley-hamilton-theorem
  6. Determinant of product is product of determinants
  7. A field is an integral domain
  8. A matrix is full-rank iff its determinant is non-0
  9. Rank of a homogenous system of linear equations

Every square matrix over $\mathbb{C}$ has at least one eigenvalue.
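
As a concrete illustration, consider the real rotation matrix $R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$: it has no real eigenvalue, but over $\mathbb{C}$ it has the eigenvalues $\pm i$. This matrix is used as a running example in the proof below.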

Proof

Let $A$ be an $n$ by $n$ matrix over $\mathbb{C}$.

Let $p_A(x) = |xI - A|$ be the characteristic polynomial of $A$. $p_A$ is a monic polynomial of degree $n$.
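
For the running example $R$: \[ p_R(x) = |xI - R| = \begin{vmatrix} x & 1 \\ -1 & x \end{vmatrix} = x^2 + 1, \] a monic polynomial of degree $n = 2$.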

By the fundamental theorem of algebra, \[ p_A(x) = \prod_{i=1}^n (x - a_i) \] where $\forall i, a_i \in \mathbb{C}$.
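
Continuing the example: $p_R(x) = x^2 + 1 = (x - i)(x + i)$, so $a_1 = i$ and $a_2 = -i$.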

By the Cayley-Hamilton theorem, $p_A(A) = 0$.

\[ p_A(A) = 0 \implies \prod_{i=1}^n (A-a_iI) = 0 \implies \left|\prod_{i=1}^n (A-a_iI)\right| = 0 \implies \prod_{i=1}^n |A-a_iI| = 0 \]
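
Continuing the example: $R^2 = -I$, so \[ p_R(R) = (R - iI)(R + iI) = R^2 + I = 0, \] and hence $|R - iI| \cdot |R + iI| = 0$.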

This is a product of complex numbers, and $\mathbb{C}$ is a field, hence an integral domain with no zero-divisors. Therefore at least one factor vanishes: $\exists i, |A-a_iI| = 0$.
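
In the example, both factors happen to vanish: $|R - iI| = (-i)(-i) - (-1)(1) = -1 + 1 = 0$, and likewise $|R + iI| = 0$, so either factor may be chosen.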

Let $\lambda = a_i$ for such an $i$.

\begin{align} & |A-\lambda I| = 0 \\ &\Rightarrow \operatorname{rank}(A-\lambda I) < n \\ &\Rightarrow \exists u \neq 0, (A - \lambda I)u = 0 \\ &\Rightarrow \exists u \neq 0, Au = (\lambda I)u = \lambda u \end{align}
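
In the example, take $\lambda = i$: the homogeneous system $(R - iI)u = 0$ has the non-zero solution $u = \begin{pmatrix} 1 \\ -i \end{pmatrix}$, and indeed $Ru = \begin{pmatrix} i \\ 1 \end{pmatrix} = iu$.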

Therefore, $(\lambda, u)$ is an eigenvalue-eigenvector pair for $A$.

Dependency for:

  1. Symmetric operator on V has a basis of orthonormal eigenvectors

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /linear-algebra/eigenvectors/cayley-hamilton-theorem
  4. /misc/fundamental-theorem-of-algebra
  5. /sets-and-relations/composition-of-bijections-is-a-bijection
  6. /sets-and-relations/equivalence-relation
  7. Group
  8. Ring
  9. Polynomial
  10. Integral Domain
  11. Comparing coefficients of a polynomial with disjoint variables
  12. 0x = 0 = x0
  13. Field
  14. Vector Space
  15. Linear independence
  16. Span
  17. Linear transformation
  18. Composition of linear transformations
  19. Vector space isomorphism is an equivalence relation
  20. A field is an integral domain
  21. Semiring
  22. Matrix
  23. Stacking
  24. System of linear equations
  25. Product of stacked matrices
  26. Matrix multiplication is associative
  27. Reduced Row Echelon Form (RREF)
  28. Submatrix
  29. Determinant
  30. Determinant of upper triangular matrix
  31. Swapping last 2 rows of a matrix negates its determinant
  32. Matrices over a field form a vector space
  33. Row space
  34. Elementary row operation
  35. Determinant after elementary row operation
  36. Every elementary row operation has a unique inverse
  37. Row equivalence of matrices
  38. Row equivalent matrices have the same row space
  39. RREF is unique
  40. Identity matrix
  41. Inverse of a matrix
  42. Inverse of product
  43. Elementary row operation is matrix pre-multiplication
  44. Row equivalence matrix
  45. Equations with row equivalent matrices have the same solution set
  46. Basis of a vector space
  47. Linearly independent set is not bigger than a span
  48. Homogeneous linear equations with more variables than equations
  49. Rank of a homogenous system of linear equations
  50. Rank of a matrix
  51. Basis of F^n
  52. Matrix of linear transformation
  53. Coordinatization over a basis
  54. Basis changer
  55. Basis change is an isomorphic linear transformation
  56. Vector spaces are isomorphic iff their dimensions are same
  57. Canonical decomposition of a linear transformation
  58. Eigenvalues and Eigenvectors
  59. Full-rank square matrix in RREF is the identity matrix
  60. A matrix is full-rank iff its determinant is non-0
  61. Characteristic polynomial of a matrix
  62. Degree and monicness of a characteristic polynomial
  63. Full-rank square matrix is invertible
  64. AB = I implies BA = I
  65. Determinant of product is product of determinants