A is diagonalizable iff there are n linearly independent eigenvectors
Dependencies:
- Diagonalization
- Linear independence
- Inverse of a matrix
- Transpose of product
- Full-rank square matrix is invertible
- A matrix is full-rank iff its rows are linearly independent
Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable iff there are $n$ linearly independent eigenvectors of $A$.
Proof
First, for an $n \times n$ matrix $P$, $P^{-1}$ exists iff the columns of $P$ are linearly independent:
\begin{align}
& P^{-1} \textrm{ exists} \\
&\iff \exists Q, QP = PQ = I \\
&\iff \exists Q, P^TQ^T = Q^TP^T = I \\
&\iff (P^T)^{-1} \textrm{ exists} \\
&\iff \operatorname{rank}(P^T) = n \\
&\iff \textrm{Rows of } P^T \textrm{ are linearly independent} \\
&\iff \textrm{Columns of } P \textrm{ are linearly independent}
\end{align}
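As a quick sanity check of this chain (an illustration added here, not part of the original argument), consider the matrix $M$ below, whose second column is twice its first:
\begin{align}
M = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, \qquad
\begin{bmatrix} 2 \\ 4 \end{bmatrix} = 2 \begin{bmatrix} 1 \\ 2 \end{bmatrix}
\end{align}
The columns of $M$ are linearly dependent, so by the chain above $\operatorname{rank}(M^T) < 2$ and no $Q$ with $QM = MQ = I$ exists, i.e., $M^{-1}$ does not exist.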
For an $n \times n$ matrix $P$ with nonzero columns: $(\exists \textrm{ diagonal } D, AP = PD) \iff$ the columns of $P$ are eigenvectors of $A$.
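To see why, compare $AP$ and $PD$ column by column (a sketch using the names $v_1, \ldots, v_n$ for the columns of $P$ and $\lambda_1, \ldots, \lambda_n$ for the diagonal entries of $D$, notation not used elsewhere on this page):
\begin{align}
AP &= A\,[\, v_1 \mid v_2 \mid \cdots \mid v_n \,] = [\, Av_1 \mid Av_2 \mid \cdots \mid Av_n \,] \\
PD &= [\, v_1 \mid v_2 \mid \cdots \mid v_n \,] \operatorname{diag}(\lambda_1, \ldots, \lambda_n) = [\, \lambda_1 v_1 \mid \lambda_2 v_2 \mid \cdots \mid \lambda_n v_n \,]
\end{align}
So $AP = PD$ iff $Av_i = \lambda_i v_i$ for each $i$, i.e., iff each (nonzero) column $v_i$ is an eigenvector of $A$ with eigenvalue $\lambda_i$.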
If there are $n$ linearly independent eigenvectors of $A$, make them the columns of $P$. Then $AP = PD$ for some diagonal $D$, and $P^{-1}$ exists, so $D = P^{-1}AP$. Therefore, $A$ is diagonalizable.
Conversely, if $A$ is diagonalizable, there is a $P$ such that $P^{-1}$ exists and $AP = PD$ for some diagonal $D$. Therefore, the columns of $P$ are linearly independent and they are eigenvectors of $A$. Hence, $A$ has $n$ linearly independent eigenvectors.
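As a concrete illustration (a worked example added here, not part of the original proof), the symmetric matrix $A$ below has eigenvectors $(1, 1)$ and $(1, -1)$ with eigenvalues $3$ and $1$; they are linearly independent, so taking them as the columns of $P$ diagonalizes $A$:
\begin{align}
A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \quad
P = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \quad
P^{-1} = \frac{1}{2}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \quad
P^{-1}AP = \begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix}
\end{align}
In contrast, $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ has only the one-dimensional eigenspace spanned by $(1, 0)$, so it does not have two linearly independent eigenvectors and is therefore not diagonalizable.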
Dependency for: None
Info:
- Depth: 13
- Number of transitive dependencies: 57
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- Field
- Vector Space
- Linear independence
- Span
- Decrementing a span
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Rows of RREF are linearly independent
- Transpose of product
- Matrices over a field form a vector space
- Row space
- Elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogenous system of linear equations
- Rank of a matrix
- Spanning set of size dim(V) is a basis
- A matrix is full-rank iff its rows are linearly independent
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors
- Diagonalization
- Full-rank square matrix in RREF is the identity matrix
- Full-rank square matrix is invertible