Orthonormal basis change matrix

Dependencies:

  1. Inner product space
  2. Basis of a vector space
  3. Orthogonality and orthonormality
  4. Orthogonal matrix

Let $U = [u_1, u_2, \ldots, u_m]$ be an orthonormal basis for an inner-product space $\mathcal{S}$ over the field $F$; we take $F = \mathbb{R}$ here, so that the inner product is symmetric and bilinear. Let $V = [v_1, v_2, \ldots, v_m]$ be another orthonormal basis for $\mathcal{S}$.

Since $U$ is a basis, every vector in $\mathcal{S}$, including each $v_i$, can be represented as a linear combination of the vectors in $U$. Therefore, let $A$ be the $m$-by-$m$ matrix over $F$ such that \[ \forall i,\ v_i = \sum_{j=1}^m A[i, j] u_j. \] Such an $A$ is called a basis change matrix from $V$ to $U$: if $c$ gives the coordinates of a vector $x$ w.r.t. $V$, then $x = \sum_i c_i v_i = \sum_j \left( \sum_i A[i, j] c_i \right) u_j$, so pre-multiplying $c$ by $A^T$ gives the coordinates of $x$ w.r.t. $U$.
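
For concreteness, here is a minimal numerical sketch (not part of the original page; it assumes NumPy, takes $\mathcal{S} = \mathbb{R}^2$ with the dot product, $U$ the standard basis, and $V$ the standard basis rotated by an angle $\theta$). The rows of $A$ hold the coordinates of each $v_i$ w.r.t. $U$, and pre-multiplying $V$-coordinates by $A^T$ recovers $U$-coordinates:

```python
import numpy as np

# Two orthonormal bases of R^2 under the dot product:
# U is the standard basis; V is the standard basis rotated by theta.
theta = 0.3
U = np.eye(2)                                   # row i is u_i
V = np.array([[np.cos(theta),  np.sin(theta)],  # row i is v_i
              [-np.sin(theta), np.cos(theta)]])

# A[i, j] = <v_i, u_j>, so that v_i = sum_j A[i, j] u_j.
A = V @ U.T

# Check the defining property: row i of A @ U equals v_i.
assert np.allclose(A @ U, V)

# Coordinates c of a vector x w.r.t. V ...
c = np.array([1.5, -2.0])
x = c @ V                              # x = sum_i c_i v_i

# ... become coordinates w.r.t. U after pre-multiplying by A^T.
assert np.allclose(A.T @ c, x @ U.T)   # x @ U.T = (<x, u_1>, <x, u_2>)
```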

I'll prove that $A$ is orthogonal.

Proof

\begin{align}
I[i, j] &= \langle v_i, v_j \rangle \tag{$\because$ $V$ is orthonormal} \\
&= \left\langle \sum_{p=1}^m A[i, p] u_p , \sum_{q=1}^m A[j, q] u_q \right\rangle \\
&= \sum_{p=1}^m \sum_{q=1}^m A[i, p] A[j, q] \langle u_p, u_q \rangle \tag{bilinearity of the inner product} \\
&= \sum_{p=1}^m \sum_{q=1}^m A[i, p] I[p, q] A[j, q] \tag{$\because$ $U$ is orthonormal} \\
&= (AIA^T)[i, j] = (AA^T)[i, j]
\end{align}

Therefore, $AA^T = I$. Since $A$ is a square matrix, it follows that $A^TA = I$ as well, so $A$ is orthogonal.
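
As a quick numerical check of this result (again just a sketch assuming NumPy; the bases are generated randomly via QR decomposition), the basis change matrix between two random orthonormal bases of $\mathbb{R}^m$ indeed satisfies $AA^T = A^TA = I$:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

# Random orthonormal bases of R^m: the Q factor of a QR decomposition
# has orthonormal columns, so its transpose has orthonormal rows.
U = np.linalg.qr(rng.standard_normal((m, m)))[0].T  # row i is u_i
V = np.linalg.qr(rng.standard_normal((m, m)))[0].T  # row i is v_i

# A[i, j] = <v_i, u_j>, i.e. v_i = sum_j A[i, j] u_j.
A = V @ U.T

# The theorem: A is orthogonal.
assert np.allclose(A @ A.T, np.eye(m))
assert np.allclose(A.T @ A, np.eye(m))
```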

Dependency for:

  1. Standard normal random vector on vector space

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /complex-numbers/conjugate-product-abs
  4. /complex-numbers/conjugation-is-homomorphic
  5. /sets-and-relations/equivalence-relation
  6. Group
  7. Ring
  8. Polynomial
  9. Vector
  10. Dot-product of vectors
  11. Integral Domain
  12. Comparing coefficients of a polynomial with disjoint variables
  13. Field
  14. Vector Space
  15. Linear independence
  16. Span
  17. Inner product space
  18. Orthogonality and orthonormality
  19. Semiring
  20. Matrix
  21. Stacking
  22. System of linear equations
  23. Product of stacked matrices
  24. Transpose of stacked matrix
  25. Matrix multiplication is associative
  26. Reduced Row Echelon Form (RREF)
  27. Transpose of product
  28. Trace of a matrix
  29. Matrices over a field form a vector space
  30. Row space
  31. Matrices form an inner-product space
  32. Elementary row operation
  33. Every elementary row operation has a unique inverse
  34. Row equivalence of matrices
  35. Row equivalent matrices have the same row space
  36. RREF is unique
  37. Identity matrix
  38. Inverse of a matrix
  39. Inverse of product
  40. Elementary row operation is matrix pre-multiplication
  41. Row equivalence matrix
  42. Equations with row equivalent matrices have the same solution set
  43. Basis of a vector space
  44. Linearly independent set is not bigger than a span
  45. Homogeneous linear equations with more variables than equations
  46. Rank of a homogeneous system of linear equations
  47. Rank of a matrix
  48. Full-rank square matrix in RREF is the identity matrix
  49. Full-rank square matrix is invertible
  50. AB = I implies BA = I
  51. Orthogonal matrix