Eigenvectors of distinct eigenvalues are linearly independent

Dependencies:

  1. Eigenvalues and Eigenvectors
  2. Linear independence
  3. Zeros in vector space

Let $T: V \to V$ be a linear transformation. Let $[v_1, v_2, \ldots, v_n]$ be eigenvectors corresponding to distinct eigenvalues $[\lambda_1, \lambda_2, \ldots, \lambda_n]$. Then $[v_1, v_2, \ldots, v_n]$ are linearly independent.
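As a concrete sanity check (not part of the proof), the claim can be verified numerically: if a matrix has distinct eigenvalues, the matrix whose columns are its eigenvectors must have full rank. The matrix `A` below is an illustrative choice.

```python
import numpy as np

# An upper-triangular matrix, so its eigenvalues are the diagonal
# entries 1, 2, 3 -- all distinct.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

# Columns of `eigenvectors` are the eigenvectors v_1, v_2, v_3.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Linear independence of the columns is equivalent to full rank.
print(np.linalg.matrix_rank(eigenvectors))  # 3
```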

Proof by induction

Let $S_k = [v_1, v_2, \ldots, v_k]$.

Let $P(k)$ be the statement: $S_k$ is linearly independent.

We have to prove $P(k)$ for all $0 \le k \le n$.

Base case:

The empty set is linearly independent by definition, so $P(0)$ holds. Since eigenvectors are non-zero by definition, $S_1 = [v_1]$ is linearly independent, so $P(1)$ holds.

Inductive step:

Assume $P(k)$ holds for some $k$ with $1 \le k < n$. Then $S_k$ is linearly independent.

Suppose $\sum_{i=1}^{k+1} a_iv_i = 0$ for some scalars $a_1, \ldots, a_{k+1}$.

First, note that $T(0) = 0$:

\[ T(0) = T(0 + 0) = T(0) + T(0) \implies T(0) = 0 \]

Applying $T$ to the linear combination:

\[ 0 = T(0) = T\left(\sum_{i=1}^{k+1} a_iv_i\right) = \sum_{i=1}^{k+1}a_iT(v_i) = \sum_{i=1}^{k+1}a_i\lambda_iv_i = a_{k+1}\lambda_{k+1}v_{k+1} + \sum_{i=1}^k a_i\lambda_iv_i \]

Multiplying the linear combination by $\lambda_{k+1}$ instead:

\[ 0 = \lambda_{k+1}0 = \lambda_{k+1}\left(\sum_{i=1}^{k+1} a_iv_i\right) = \sum_{i=1}^{k+1}a_i\lambda_{k+1}v_i = a_{k+1}\lambda_{k+1}v_{k+1} + \sum_{i=1}^k a_i\lambda_{k+1}v_i \]

Subtracting the second equation from the first, we get:

\[ 0 = \sum_{i=1}^k a_i(\lambda_i - \lambda_{k+1})v_i \]

Since $S_k$ is linearly independent, $a_i(\lambda_i - \lambda_{k+1}) = 0$ for all $i \le k$. Since the eigenvalues are distinct, $\lambda_i - \lambda_{k+1} \neq 0$ for all $i \le k$, so $a_i = 0$ for all $i \le k$.

Substituting $a_1 = \cdots = a_k = 0$ into the original linear combination:

\[ 0 = \sum_{i=1}^{k+1} a_iv_i = a_{k+1}v_{k+1} \]

Since $v_{k+1} \neq 0$ (because eigenvectors are non-zero), $a_{k+1} = 0$.

Thus $a_i = 0$ for all $i \le k+1$, so $S_{k+1}$ is linearly independent, i.e. $P(k+1)$ holds.

By the principle of mathematical induction, $P(k)$ holds for all $0 \le k \le n$. In particular, $S_n = [v_1, v_2, \ldots, v_n]$ is linearly independent.
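The central trick of the inductive step, applying $T$ and separately scaling by $\lambda_{k+1}$, then subtracting to annihilate the $v_{k+1}$ term, can also be checked numerically. The diagonal matrix and coefficients below are arbitrary illustrative choices.

```python
import numpy as np

# T is a diagonal matrix with distinct eigenvalues 1, 2, 3;
# its eigenvectors are the standard basis vectors e_1, e_2, e_3.
A = np.diag([1.0, 2.0, 3.0])
v1, v2, v3 = np.eye(3)
a = np.array([5.0, -7.0, 4.0])  # arbitrary coefficients

v = a[0] * v1 + a[1] * v2 + a[2] * v3

# T(v) - lambda_3 * v = sum_i a_i (lambda_i - lambda_3) v_i,
# so the v_3 component vanishes: (lambda_3 - lambda_3) = 0.
w = A @ v - 3.0 * v
print(w)  # components: 5*(1-3), -7*(2-3), 4*(3-3) = -10, 7, 0
```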

Dependency for: None

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /sets-and-relations/composition-of-bijections-is-a-bijection
  4. /sets-and-relations/equivalence-relation
  5. Group
  6. Ring
  7. Polynomial
  8. Integral Domain
  9. Comparing coefficients of a polynomial with disjoint variables
  10. Field
  11. Vector Space
  12. Linear independence
  13. Zeros in vector space
  14. Span
  15. Linear transformation
  16. Composition of linear transformations
  17. Vector space isomorphism is an equivalence relation
  18. Semiring
  19. Matrix
  20. Stacking
  21. System of linear equations
  22. Product of stacked matrices
  23. Matrix multiplication is associative
  24. Reduced Row Echelon Form (RREF)
  25. Matrices over a field form a vector space
  26. Row space
  27. Elementary row operation
  28. Every elementary row operation has a unique inverse
  29. Row equivalence of matrices
  30. Row equivalent matrices have the same row space
  31. RREF is unique
  32. Identity matrix
  33. Inverse of a matrix
  34. Inverse of product
  35. Elementary row operation is matrix pre-multiplication
  36. Row equivalence matrix
  37. Equations with row equivalent matrices have the same solution set
  38. Basis of a vector space
  39. Linearly independent set is not bigger than a span
  40. Homogeneous linear equations with more variables than equations
  41. Rank of a homogeneous system of linear equations
  42. Rank of a matrix
  43. Basis of F^n
  44. Matrix of linear transformation
  45. Coordinatization over a basis
  46. Basis changer
  47. Basis change is an isomorphic linear transformation
  48. Vector spaces are isomorphic iff their dimensions are same
  49. Canonical decomposition of a linear transformation
  50. Eigenvalues and Eigenvectors