Eigenvectors of distinct eigenvalues are linearly independent
Dependencies:
Let $T: V \to V$ be a linear transformation. Let $[v_1, v_2, \ldots, v_n]$ be eigenvectors of $T$ corresponding to distinct eigenvalues $[\lambda_1, \lambda_2, \ldots, \lambda_n]$. Then $[v_1, v_2, \ldots, v_n]$ are linearly independent.
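As a quick sanity check, here is a minimal numerical sketch, assuming NumPy; the matrix `A` below is an arbitrary choice, not part of the statement. It builds a matrix with distinct eigenvalues and confirms that the matrix whose columns are the eigenvectors has full rank, which is equivalent to the eigenvectors being linearly independent.

```python
import numpy as np

# An arbitrary 3x3 matrix with distinct eigenvalues (2, 3, 5, read off
# the diagonal, since the matrix is upper triangular).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are eigenvectors of A

# The eigenvalues are pairwise distinct...
assert len(np.unique(np.round(eigenvalues, 8))) == 3

# ...so, by the theorem, the eigenvectors are linearly independent:
# the matrix with the eigenvectors as its columns has full rank.
assert np.linalg.matrix_rank(eigenvectors) == 3
```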
Proof by induction
Let $S_k = [v_1, v_2, \ldots, v_k]$.
Let $P(k): S_k$ is linearly independent.
We have to prove $P(k)$ for all $0 \le k \le n$.
Base case:
The empty set is linearly independent by definition. Therefore, $P(0)$ holds. Since eigenvectors are non-zero, $S_1 = [v_1]$ is linearly independent. Therefore, $P(1)$ holds.
Inductive step:
Assume $P(k)$ holds for some $k$ with $1 \le k < n$; we will deduce $P(k+1)$. By the inductive hypothesis, $S_k$ is linearly independent.
Suppose $\sum_{i=1}^{k+1} a_iv_i = 0$ for some scalars $a_1, a_2, \ldots, a_{k+1}$.
First, note that a linear transformation maps $0$ to $0$:
\[ T(0) = T(0 + 0) = T(0) + T(0) \implies T(0) = 0 \]
Applying $T$ to the relation $\sum_{i=1}^{k+1} a_iv_i = 0$:
\[ 0 = T(0) = T\left(\sum_{i=1}^{k+1} a_iv_i\right) = \sum_{i=1}^{k+1}a_iT(v_i) = \sum_{i=1}^{k+1}a_i\lambda_iv_i = a_{k+1}\lambda_{k+1}v_{k+1} + \sum_{i=1}^k a_i\lambda_iv_i \]
Multiplying the relation by $\lambda_{k+1}$ instead:
\[ 0 = \lambda_{k+1}0 = \lambda_{k+1}\left(\sum_{i=1}^{k+1} a_iv_i\right) = \sum_{i=1}^{k+1}a_i\lambda_{k+1}v_i = a_{k+1}\lambda_{k+1}v_{k+1} + \sum_{i=1}^k a_i\lambda_{k+1}v_i \]
Subtracting the second equation from the first (the $a_{k+1}\lambda_{k+1}v_{k+1}$ terms cancel), we get:
\[ 0 = \sum_{i=1}^k a_i(\lambda_i - \lambda_{k+1})v_i \]
Since $S_k$ is linearly independent, $\forall i \le k, a_i(\lambda_i - \lambda_{k+1}) = 0$. Since all $\lambda_i$ are distinct, $\lambda_i - \lambda_{k+1} \neq 0$ for $i \le k$, so $\forall i \le k, a_i = 0$.
Substituting $a_1 = a_2 = \cdots = a_k = 0$ into the original relation:
\[ 0 = \sum_{i=1}^{k+1} a_iv_i = a_{k+1}v_{k+1} \]
Since $v_{k+1} \neq 0$ (because eigenvectors are non-zero), $a_{k+1} = 0$.
Since $\forall i \le k+1, a_i = 0$, $S_{k+1}$ is linearly independent, i.e., $P(k+1)$ holds.
By the principle of mathematical induction, $S_n$ is linearly independent.
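For concreteness, here is the smallest nontrivial instance of the inductive step (the case $k = 1$, i.e., two eigenvectors): suppose $a_1v_1 + a_2v_2 = 0$ with $\lambda_1 \neq \lambda_2$. Applying $T$ gives $a_1\lambda_1v_1 + a_2\lambda_2v_2 = 0$, while multiplying the relation by $\lambda_2$ gives $a_1\lambda_2v_1 + a_2\lambda_2v_2 = 0$. Subtracting:
\[ a_1(\lambda_1 - \lambda_2)v_1 = 0 \]
Since $\lambda_1 - \lambda_2 \neq 0$ and $v_1 \neq 0$, we get $a_1 = 0$; then $a_2v_2 = 0$ forces $a_2 = 0$.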
Dependency for: None
Info:
- Depth: 12
- Number of transitive dependencies: 50
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- Field
- Vector Space
- Linear independence
- Zeros in vector space
- Span
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Matrices over a field form a vector space
- Row space
- Elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogeneous system of linear equations
- Rank of a matrix
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors