Matrix of linear transformation

Dependencies:

  1. Matrix
  2. Linear transformation
  3. Basis of F^n
  4. Comparing coefficients of a polynomial with disjoint variables

Let $F$ be a field. Let $T: F^m \to F^n$ be a linear transformation. Then $T$ can be expressed as pre-multiplication by a unique $n$ by $m$ matrix. Formally, \[ \exists A \in \mathbb{M}_{m, n}(F), \forall u \in F^m, T(u) = A^Tu \] where $u$ is treated as a column vector of length $m$, $A$ is unique, and the $n$ by $m$ matrix in question is $A^T$.

Conversely, if $A$ is an $m$ by $n$ matrix, then $T(u) = Au$ defines a linear transformation from $F^n$ to $F^m$.
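For a concrete illustration (the specific map below is an example chosen here, with $F = \mathbb{R}$, $m = 2$, $n = 3$): let $T(u_1, u_2) = (u_1 + u_2, 2u_1, u_2)$. Then $T(e_1) = T(1, 0) = (1, 2, 0)$ and $T(e_2) = T(0, 1) = (1, 0, 1)$, so row $i$ of $A$ is $T(e_i)$:

\[ A = \begin{bmatrix} 1 & 2 & 0 \\ 1 & 0 & 1 \end{bmatrix}, \qquad A^Tu = \begin{bmatrix} 1 & 1 \\ 2 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} u_1 + u_2 \\ 2u_1 \\ u_2 \end{bmatrix} = T(u). \]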

Proof

Let $E = \{e_1 = (1, 0, \ldots, 0), e_2 = (0, 1, \ldots, 0), \ldots, e_m = (0, 0, \ldots, 1) \}$ be the standard basis of $F^m$.

Let $u \in F^m$ where $u = (u_1, u_2, \ldots, u_m) = \sum_{i=1}^m u_ie_i$.

\[ T(u) = T\left(\sum_{i=1}^m u_ie_i\right) = \sum_{i=1}^m u_iT(e_i) \]

Since $T(e_i) \in F^n$, we can define an $m$ by $n$ matrix $A$ by $A[i, j] = T(e_i)_j$, i.e. row $i$ of $A$ is $T(e_i)$.

\[ T(u)_j = \sum_{i=1}^m u_iT(e_i)_j = \sum_{i=1}^m u_iA[i, j] = \sum_{i=1}^m A^T[j, i]u[i, 1] = (A^Tu)_j \]

Therefore, $T(u) = A^Tu$.
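
The construction above can also be checked numerically. Below is a minimal sketch (not part of the proof) assuming a hypothetical example map $T$ over $F = \mathbb{R}$ with $m = 2$, $n = 4$; it builds $A$ row by row from the images of the standard basis vectors and verifies $T(u) = A^Tu$.

```python
import numpy as np

# Minimal numerical sketch (not part of the proof). The map T below is a
# hypothetical example with F = R, m = 2, n = 4.
def T(u):
    u1, u2 = u
    return np.array([u1 + u2, 2 * u1, u2, u1 - 3 * u2], dtype=float)

m, n = 2, 4
E = np.eye(m)                                # rows are the standard basis e_1, ..., e_m of R^m
A = np.array([T(E[i]) for i in range(m)])    # A[i, j] = T(e_i)_j, so A is m x n
assert A.shape == (m, n)

u = np.array([3.0, -1.0])                    # an arbitrary test vector
assert np.allclose(T(u), A.T @ u)            # T(u) = A^T u, matching the construction above
```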

Suppose $T(u) = A^Tu = B^Tu$ for all $u \in F^m$.

\[ (\forall u \in F^m, A^Tu = B^Tu) \implies (\forall u \in F^m, (A-B)^Tu = 0) \]

\[ ((A-B)^Tu)_i = \sum_{j=1}^m (A-B)^T[i, j]u_j = 0 \]

Since this holds for every $u \in F^m$, we can compare the coefficient of each $u_j$ (treating $u_1, u_2, \ldots, u_m$ as independent variables) to get $(A-B)^T[i, j] = 0$ for all $i$ and $j$. This means $(A-B)^T = 0 \Rightarrow A = B$.

Therefore, $T$ has a unique matrix associated with it.

Conversely, let $A$ be an $m$ by $n$ matrix and define $T: F^n \to F^m$ by $T(u) = Au$. Let $u, v \in F^n$ and $c \in F$.

\[ T(u + v) = A(u + v) = Au + Av = T(u) + T(v) \] \[ T(cv) = A(cv) = c(Av) = cT(v) \]

Therefore, $T$ is a linear transformation.
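
A similar numeric sketch of the converse (with an arbitrary example matrix $A$ over $F = \mathbb{R}$, chosen here only for illustration) checks additivity and homogeneity of $u \mapsto Au$.

```python
import numpy as np

# Minimal sketch of the converse (A is an arbitrary example matrix over F = R):
# pre-multiplication by a fixed m x n matrix is linear as a map R^n -> R^m.
rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))

def T(u):
    return A @ u                             # T: R^n -> R^m

u = rng.standard_normal(n)
v = rng.standard_normal(n)
c = 2.5
assert np.allclose(T(u + v), T(u) + T(v))    # additivity
assert np.allclose(T(c * u), c * T(u))       # homogeneity
```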

Dependency for:

  1. Matrix of orthonormal basis change
  2. Symmetric operator iff hermitian
  3. Eigenvalues and Eigenvectors
  4. Symmetric operator on V has a basis of orthonormal eigenvectors

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /sets-and-relations/equivalence-relation
  4. Group
  5. Ring
  6. Polynomial
  7. Integral Domain
  8. Comparing coefficients of a polynomial with disjoint variables
  9. Field
  10. Vector Space
  11. Linear independence
  12. Span
  13. Linear transformation
  14. Semiring
  15. Matrix
  16. Stacking
  17. System of linear equations
  18. Product of stacked matrices
  19. Matrix multiplication is associative
  20. Reduced Row Echelon Form (RREF)
  21. Matrices over a field form a vector space
  22. Row space
  23. Elementary row operation
  24. Every elementary row operation has a unique inverse
  25. Row equivalence of matrices
  26. Row equivalent matrices have the same row space
  27. RREF is unique
  28. Identity matrix
  29. Inverse of a matrix
  30. Inverse of product
  31. Elementary row operation is matrix pre-multiplication
  32. Row equivalence matrix
  33. Equations with row equivalent matrices have the same solution set
  34. Basis of a vector space
  35. Linearly independent set is not bigger than a span
  36. Homogeneous linear equations with more variables than equations
  37. Rank of a homogeneous system of linear equations
  38. Rank of a matrix
  39. Basis of F^n