Orthonormal basis change matrix
Dependencies:
Let $U = [u_1, u_2, \ldots, u_m]$ be an orthonormal basis for an inner-product space $\mathcal{S}$ over the field $F$. Let $V = [v_1, v_2, \ldots, v_m]$ be another orthonormal basis for $\mathcal{S}$.
Since $U$ is a basis, every vector in $\mathcal{S}$, including each $v_i$, can be written as a linear combination of the vectors in $U$. Therefore, let $A$ be an $m$-by-$m$ matrix over $F$ such that \[ \forall i, \quad v_i = \sum_{j=1}^m A[i, j] u_j. \] Such an $A$ is called a basis change matrix from $V$ to $U$ (because pre-multiplying the coordinates of a vector w.r.t. $V$ by $A^T$ gives its coordinates w.r.t. $U$).
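To make the definition concrete, here is a minimal numerical sketch (assuming $F = \mathbb{R}$ with the standard dot product; the rotation angle and the coordinates below are illustrative choices, not from the source). It checks both the expansion $v_i = \sum_j A[i, j] u_j$ and the coordinate rule just stated:

```python
import numpy as np

# Two orthonormal bases of R^2 (rows are basis vectors):
# U is the standard basis, V is U rotated by 30 degrees.
theta = np.pi / 6
U = np.eye(2)
V = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# A[i, j] = <v_i, u_j>: the coefficients expanding each v_i in the basis U.
A = V @ U.T

# Check v_i = sum_j A[i, j] u_j for every i.
assert np.allclose(V, A @ U)

# Coordinate rule: if x has coordinates c w.r.t. V,
# then A^T c are its coordinates w.r.t. U.
c = np.array([2.0, -1.0])   # arbitrary coordinates w.r.t. V
x = c @ V                   # the vector itself: sum_i c_i v_i
assert np.allclose(x, (A.T @ c) @ U)
```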
I'll prove that $A$ is orthogonal.
Proof
\begin{align}
I[i, j] &= \langle v_i, v_j \rangle
\tag{$\because$ $V$ is orthonormal} \\
&= \left\langle \sum_{p=1}^m A[i, p] u_p , \sum_{q=1}^m A[j, q] u_q \right\rangle \\
&= \sum_{p=1}^m \sum_{q=1}^m A[i, p] A[j, q] \langle u_p, u_q \rangle
\tag{bilinearity of the inner product} \\
&= \sum_{p=1}^m \sum_{q=1}^m A[i, p] I[p, q] A[j, q]
\tag{$\because$ $U$ is orthonormal} \\
&= (AIA^T)[i, j] = (AA^T)[i, j]
\end{align}
Therefore, $AA^T = I$. Since $A$ is a square matrix, $AA^T = I$ also implies $A^TA = I$ (by "AB = I implies BA = I"), so $A$ is orthogonal.
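As a numerical sanity check of the result (again a sketch over $\mathbb{R}$; QR factorization is used purely as a convenient source of random orthonormal bases, not as part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 5

# Random orthonormal bases of R^m: the columns of a Q factor are
# orthonormal, so the rows of Q.T form an orthonormal basis.
U = np.linalg.qr(rng.standard_normal((m, m)))[0].T
V = np.linalg.qr(rng.standard_normal((m, m)))[0].T

# Basis change matrix from V to U: A[i, j] = <v_i, u_j>.
A = V @ U.T

# The theorem: A is orthogonal, i.e. A A^T = I (and hence A^T A = I).
assert np.allclose(A @ A.T, np.eye(m))
assert np.allclose(A.T @ A, np.eye(m))
```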
Dependency for:
Info:
- Depth: 12
- Number of transitive dependencies: 51
Transitive dependencies:
- /linear-algebra/vector-spaces/condition-for-subspace
- /linear-algebra/matrices/gauss-jordan-algo
- /complex-numbers/conjugate-product-abs
- /complex-numbers/conjugation-is-homomorphic
- /sets-and-relations/equivalence-relation
- Group
- Ring
- Polynomial
- Vector
- Dot-product of vectors
- Integral Domain
- Comparing coefficients of a polynomial with disjoint variables
- Field
- Vector Space
- Linear independence
- Span
- Inner product space
- Orthogonality and orthonormality
- Semiring
- Matrix
- Stacking
- System of linear equations
- Product of stacked matrices
- Transpose of stacked matrix
- Matrix multiplication is associative
- Reduced Row Echelon Form (RREF)
- Transpose of product
- Trace of a matrix
- Matrices over a field form a vector space
- Row space
- Matrices form an inner-product space
- Elementary row operation
- Every elementary row operation has a unique inverse
- Row equivalence of matrices
- Row equivalent matrices have the same row space
- RREF is unique
- Identity matrix
- Inverse of a matrix
- Inverse of product
- Elementary row operation is matrix pre-multiplication
- Row equivalence matrix
- Equations with row equivalent matrices have the same solution set
- Basis of a vector space
- Linearly independent set is not bigger than a span
- Homogeneous linear equations with more variables than equations
- Rank of a homogeneous system of linear equations
- Rank of a matrix
- Full-rank square matrix in RREF is the identity matrix
- Full-rank square matrix is invertible
- AB = I implies BA = I
- Orthogonal matrix