Bounding matrix quadratic form using eigenvalues
Dependencies:
- Matrix
- Eigenvalues and Eigenvectors
- All eigenvalues of a hermitian matrix are real
- Orthogonally diagonalizable iff hermitian
- Matrix multiplication is associative
- Transpose of product
Let $A$ be an $n \times n$ real symmetric matrix. Let $\lambda_{\textrm{min}}$ and $\lambda_{\textrm{max}}$ be the minimum and maximum eigenvalues of $A$, and let $v_{\textrm{min}}$ and $v_{\textrm{max}}$ be corresponding eigenvectors. Then \[ \forall u \in \mathbb{R}^n, \quad \lambda_{\textrm{min}}u^Tu \le u^TAu \le \lambda_{\textrm{max}}u^Tu. \] Also, \[ v_{\textrm{min}}^TAv_{\textrm{min}} = \lambda_{\textrm{min}}v_{\textrm{min}}^Tv_{\textrm{min}} \quad\text{and}\quad v_{\textrm{max}}^TAv_{\textrm{max}} = \lambda_{\textrm{max}}v_{\textrm{max}}^Tv_{\textrm{max}}, \] so both bounds are attained, i.e. the bound is tight.
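The bound can be checked numerically. The following is a quick sketch using NumPy; the random matrix and vector are arbitrary illustrations, not part of the statement.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                  # symmetrize: A is real symmetric

lam = np.linalg.eigvalsh(A)        # eigenvalues of a symmetric matrix, ascending order
lam_min, lam_max = lam[0], lam[-1]

u = rng.standard_normal(4)
quad = u @ A @ u                   # the quadratic form u^T A u

# lambda_min * u^T u <= u^T A u <= lambda_max * u^T u
assert lam_min * (u @ u) <= quad <= lam_max * (u @ u)
```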
Proof
\[ v_{\textrm{min}}^TAv_{\textrm{min}} = v_{\textrm{min}}^T(\lambda_{\textrm{min}}v_{\textrm{min}}) = \lambda_{\textrm{min}}v_{\textrm{min}}^Tv_{\textrm{min}} \] \[ v_{\textrm{max}}^TAv_{\textrm{max}} = v_{\textrm{max}}^T(\lambda_{\textrm{max}}v_{\textrm{max}}) = \lambda_{\textrm{max}}v_{\textrm{max}}^Tv_{\textrm{max}} \]
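These equalities can likewise be checked numerically. A sketch, assuming NumPy's `eigh` (which returns ascending eigenvalues and orthonormal eigenvectors as columns):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2                  # real symmetric matrix

lam, V = np.linalg.eigh(A)         # ascending eigenvalues, orthonormal eigenvector columns
v_min, v_max = V[:, 0], V[:, -1]

# The bound is attained at the extreme eigenvectors:
assert np.isclose(v_min @ A @ v_min, lam[0] * (v_min @ v_min))
assert np.isclose(v_max @ A @ v_max, lam[-1] * (v_max @ v_max))
```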
Since $A$ is real and symmetric, it is orthogonally diagonalizable. Specifically, let $[v_1, v_2, \ldots, v_n]$ be $n$ orthonormal eigenvectors of $A$ and let $[\lambda_1, \lambda_2, \ldots, \lambda_n]$ be the corresponding eigenvalues. Let $P$ be the matrix whose $i^{\textrm{th}}$ column is $v_i$ and let $D$ be a diagonal matrix whose $i^{\textrm{th}}$ diagonal entry is $\lambda_i$. Then $P$ and $D$ are real and $A = PDP^T$ and $P^TP = I$.
Let $v = P^Tu$. Since $P$ is square and $P^TP = I$, we also have $PP^T = I$. Then \[ v^Tv = (P^Tu)^T(P^Tu) = u^T(PP^T)u = u^Tu \] \[ u^TAu = u^T(PDP^T)u = (P^Tu)^TD(P^Tu) = v^TDv = \sum_{i=1}^n \lambda_i v_i^2, \] where $v_i$ here denotes the $i^{\textrm{th}}$ entry of $v$.
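The change of variables $v = P^Tu$ can also be verified numerically. A sketch where NumPy's `eigh` supplies the roles of $P$ and $D$:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                  # real symmetric matrix

lam, P = np.linalg.eigh(A)         # A = P D P^T with P orthogonal
u = rng.standard_normal(5)
v = P.T @ u

assert np.isclose(v @ v, u @ u)                     # v^T v = u^T u
assert np.isclose(u @ A @ u, np.sum(lam * v**2))    # u^T A u = sum_i lambda_i v_i^2
```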
Without loss of generality, assume that $\lambda_1 \ge \lambda_2 \ge \ldots \ge \lambda_n$. \begin{align} & \forall i, \lambda_n \le \lambda_i \le \lambda_1 \\ &\Rightarrow \sum_{i=1}^n \lambda_n v_i^2 \le \sum_{i=1}^n \lambda_i v_i^2 \le \sum_{i=1}^n \lambda_1 v_i^2 \\ &\Rightarrow \lambda_n v^Tv \le \sum_{i=1}^n \lambda_i v_i^2 \le \lambda_1 v^Tv \\ &\Rightarrow \lambda_n u^Tu \le u^TAu \le \lambda_1 u^Tu \end{align} Since $\lambda_1 = \lambda_{\textrm{max}}$ and $\lambda_n = \lambda_{\textrm{min}}$ under this ordering, this is the claimed bound.
Dependency for:
Info:
- Depth: 17
- Number of transitive dependencies: 94
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /complex-numbers/conjugate-product-abs
- /complex-numbers/conjugation-is-homomorphic
- /complex-numbers/complex-numbers
- /linear-algebra/eigenvectors/cayley-hamilton-theorem
- /misc/fundamental-theorem-of-algebra
- /sets-and-relations/composition-of-bijections-is-a-bijection
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Vector
- Dot-product of vectors
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- 0x = 0 = x0
- Field
- Vector Space
- Linear independence
- Span
- Incrementing a linearly independent set
- Linear transformation
- Composition of linear transformations
- Vector space isomorphism is an equivalence relation
- Inner product space
- Inner product is anti-linear in second argument
- Orthogonality and orthonormality
- Gram-Schmidt Process
- A set of mutually orthogonal vectors is linearly independent
- Symmetric operator
- A field is an integral domain
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Transpose of stacked matrix
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Conjugate Transpose and Hermitian
- Transpose of product
- Conjugation of matrices is homomorphic
- Submatrix
- Determinant
- Determinant of upper triangular matrix
- Swapping last 2 rows of a matrix negates its determinant
- Trace of a matrix
- Matrices over a field form a vector space
- Row space
- Matrices form an inner-product space
- Elementary row operation
- Determinant after elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogenous system of linear equations
- Rank of a matrix
- A set of dim(V) linearly independent vectors is a basis
- Basis of F^n
- Matrix of linear transformation
- Coordinatization over a basis
- Basis changer
- Basis change is an isomorphic linear transformation
- Vector spaces are isomorphic iff their dimensions are same
- Canonical decomposition of a linear transformation
- Eigenvalues and Eigenvectors
- All eigenvalues of a hermitian matrix are real
- All eigenvalues of a symmetric operator are real
- Real matrix with real eigenvalues has real eigenvectors
- Diagonalization
- Symmetric operator iff hermitian
- Linearly independent set can be expanded into a basis
- Full-rank square matrix in RREF is the identity matrix
- A matrix is full-rank iff its determinant is non-0
- Characteristic polynomial of a matrix
- Degree and monicness of a characteristic polynomial
- Full-rank square matrix is invertible
- AB = I implies BA = I
- Determinant of product is product of determinants
- Every complex matrix has an eigenvalue
- Symmetric operator on V has a basis of orthonormal eigenvectors
- Orthogonal matrix
- Orthogonally diagonalizable iff hermitian