Every complex matrix has an eigenvalue
Dependencies:
- Matrix
- Eigenvalues and Eigenvectors
- Degree and monicness of a characteristic polynomial
- /misc/fundamental-theorem-of-algebra
- /linear-algebra/eigenvectors/cayley-hamilton-theorem
- Determinant of product is product of determinants
- A field is an integral domain
- A matrix is full-rank iff its determinant is non-0
- Rank of a homogeneous system of linear equations
Every square matrix over $\mathbb{C}$ has at least one eigenvalue.
Proof
Let $A$ be an $n$ by $n$ matrix over $\mathbb{C}$.
Let $p_A(x) = |xI - A|$ be the characteristic polynomial of $A$. Then $p_A$ is a monic polynomial of degree $n$.
By the fundamental theorem of algebra, \[ p_A(x) = \prod_{i=1}^n (x - a_i) \] where $\forall i, a_i \in \mathbb{C}$.
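As an illustration, take $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, a matrix with no real eigenvalue. Its characteristic polynomial factors over $\mathbb{C}$ as \[ p_A(x) = x^2 + 1 = (x - i)(x + i), \] so here $a_1 = i$ and $a_2 = -i$.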
By the Cayley-Hamilton theorem, $p_A(A) = 0$.
\[ p_A(A) = 0 \implies \prod_{i=1}^n (A-a_iI) = 0 \implies \left|\prod_{i=1}^n (A-a_iI)\right| = 0 \implies \prod_{i=1}^n |A-a_iI| = 0 \]
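In the example above, $p_A(A) = A^2 + I = 0$, and correspondingly \[ |A - iI| \cdot |A + iI| = \begin{vmatrix} -i & -1 \\ 1 & -i \end{vmatrix} \cdot \begin{vmatrix} i & -1 \\ 1 & i \end{vmatrix} = 0 \cdot 0 = 0. \]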
Each $|A-a_iI|$ is a scalar in $\mathbb{C}$; since $\mathbb{C}$ is a field and a field has no zero-divisors, $\exists i, |A-a_iI| = 0$.
Let $\lambda = a_i$ for such an $i$.
\begin{align} & |A-\lambda I| = 0 \\ &\Rightarrow \operatorname{rank}(A-\lambda I) < n \\ &\Rightarrow \exists u \neq 0, (A - \lambda I)u = 0 \\ &\Rightarrow \exists u \neq 0, Au = (\lambda I)u = \lambda u \end{align}
Therefore, $(\lambda, u)$ is an eigenvalue-eigenvector pair for $A$, so $A$ has at least one eigenvalue.
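In the example above, taking $\lambda = i$: the second row of $A - iI$ is $i$ times its first row, so $\operatorname{rank}(A - iI) = 1 < 2$, and $u = (1, -i)^T$ satisfies $Au = (i, 1)^T = iu$, so $(i, u)$ is an eigenvalue-eigenvector pair.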
Dependency for:
Info:
- Depth: 14
- Number of transitive dependencies: 65
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /linear-algebra/eigenvectors/cayley-hamilton-theorem
- /misc/fundamental-theorem-of-algebra
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- 0x = 0 = x0
- Field
- Vector Space
- Linear independence
- Span
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- A field is an integral domain
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Submatrix
- Determinant
- Determinant of upper triangular matrix
- Swapping last 2 rows of a matrix negates its determinant
- Matrices over a field form a vector space
- Row space
- Elementary row operation
- Determinant after elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogeneous system of linear equations
- Rank of a matrix
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors
- Full-rank square matrix in RREF is the identity matrix
- A matrix is full-rank iff its determinant is non-0
- Characteristic polynomial of a matrix
- Degree and monicness of a characteristic polynomial
- Full-rank square matrix is invertible
- AB = I implies BA = I
- Determinant of product is product of determinants