Eigenpair of a power of a matrix
Dependencies:
- Matrix
- Eigenvalues and Eigenvectors
- Identity matrix
- Matrix multiplication is associative
- Inverse of product
If $(\lambda, v)$ is an eigenpair of a matrix $A$, then $(\lambda^k, v)$ is an eigenpair of $A^k$ for every integer $k \ge 0$. If $A$ is invertible, this holds for every $k \in \mathbb{Z}$.
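As a quick illustration (the matrix below is chosen here only as an example, not taken from the statement): for \[ A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}, \quad v = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \] we have $Av = (3, 3)^T = 3v$, so $(3, v)$ is an eigenpair of $A$; and $A^2 = \begin{pmatrix} 4 & 5 \\ 0 & 9 \end{pmatrix}$ gives $A^2 v = (9, 9)^T = 3^2 v$, matching the claim for $k = 2$.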
Proof
Let $P(k)$ be the claim that $(\lambda^k, v)$ is an eigenpair of $A^k$. We'll prove $P(k)$ for all $k \ge 0$ by mathematical induction.
Base case: when $k = 0$, $A^0 = I$ and $Iv = v = \lambda^0 v$, so $(\lambda^0, v)$ is an eigenpair of $A^0$; hence $P(0)$ holds. ($P(1)$ is just the definition of an eigenpair.)
Inductive step: Assume $P(k)$ holds for some $k \ge 0$. Then, using associativity of matrix multiplication and linearity of $A$, \begin{align} A^{k+1}v &= (AA^k)v = A(A^kv) = A(\lambda^k v) \\ &= \lambda^k (Av) = \lambda^k (\lambda v) = \lambda^{k+1}v. \end{align} Hence $P(k+1)$ holds. By mathematical induction, $P(k)$ is true for all $k \ge 0$.
Now suppose $A$ is invertible. Then $\lambda \neq 0$: if $\lambda = 0$, then $Av = 0$ with $v \neq 0$, contradicting the invertibility of $A$. Hence \[ Av = \lambda v \implies v = A^{-1}(\lambda v) = \lambda (A^{-1}v) \implies A^{-1}v = \lambda^{-1}v, \] so $(\lambda^{-1}, v)$ is an eigenpair of $A^{-1}$.
Finally, let $k \ge 1$. Since $(\lambda^k, v)$ is an eigenpair of $A^k$ and $A^{-k} = (A^k)^{-1}$, the argument above applied to $A^k$ shows that $((\lambda^{k})^{-1}, v) = (\lambda^{-k}, v)$ is an eigenpair of $A^{-k}$. Hence the claim holds for all $k \in \mathbb{Z}$ when $A$ is invertible.
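As a numerical sanity check outside the proof, the sketch below (assuming numpy is available; the matrix, eigenpair choice, and exponents are arbitrary illustrative choices) verifies that $(\lambda^k, v)$ is an eigenpair of $A^k$ for several positive and negative $k$ when $A$ is invertible.

```python
import numpy as np

# Example matrix chosen for illustration; any invertible matrix works.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Eigendecomposition: column i of V is an eigenvector for eigvals[i].
eigvals, V = np.linalg.eig(A)
lam, v = eigvals[0], V[:, 0]

# Check that (lam**k, v) is an eigenpair of A**k for several integer k,
# including negative k, since A is invertible (det(A) = 6 != 0).
for k in [-2, -1, 0, 1, 2, 3]:
    Ak = np.linalg.matrix_power(A, k)  # A^k; negative k uses the inverse
    assert np.allclose(Ak @ v, lam**k * v), f"failed for k={k}"

print("(lam^k, v) is an eigenpair of A^k for all tested k")
```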
Dependency for: None
Info:
- Depth: 12
- Number of transitive dependencies: 49
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- Field
- Vector Space
- Linear independence
- Span
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Matrices over a field form a vector space
- Row space
- Elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogenous system of linear equations
- Rank of a matrix
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors