Coordinatization over a basis

Dependencies:

  1. Basis of a vector space

Let $B = [u_1, u_2, \ldots, u_n]$ be a finite sequence of vectors that forms a basis of the vector space $V$ over a field $F$. Then every vector $v \in V$ can be expressed uniquely as a linear combination of the vectors in $B$.

The sequence of coefficients of the linear combination is denoted by $[v]_B$ and is called the coordinates of $v$ with respect to $B$.

This means that the function $R: F^n \to V$ defined by $R([a_1, a_2, \ldots, a_n]) = \sum_{i=1}^n a_iu_i$ is a bijection, and $R^{-1}(v) = [v]_B$.
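Concretely, when $V = F^n$, finding $[v]_B$ amounts to solving the linear system whose columns are the basis vectors. Below is a minimal sketch (the function name `coordinates` and the use of exact rational arithmetic are illustrative choices, not part of the text):

```python
from fractions import Fraction

def coordinates(basis, v):
    """Solve sum_i a_i * u_i = v for the coefficients a_i by
    Gauss-Jordan elimination; the basis vectors are the columns
    of the coefficient matrix."""
    n = len(basis)
    # Build the augmented matrix [u_1 | u_2 | ... | u_n | v].
    M = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(v[i])]
         for i in range(n)]
    for col in range(n):
        # Find a pivot row (one exists because B is a basis) and swap it up.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # Normalize the pivot row, then clear the column elsewhere.
        M[col] = [x / M[col][col] for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                M[r] = [a - M[r][col] * b for a, b in zip(M[r], M[col])]
    # The augmented column of the RREF is the coordinate vector [v]_B.
    return [M[i][n] for i in range(n)]

# Coordinates of v = (3, 5) with respect to B = [(1, 1), (1, -1)]:
print(coordinates([(1, 1), (1, -1)], (3, 5)))  # -> [4, -1]
```

Indeed $4 \cdot (1, 1) + (-1) \cdot (1, -1) = (3, 5)$, and by the uniqueness proved below, $[4, -1]$ is the only such coefficient sequence.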

If $B$ is infinite, every vector $v \in V$ can be uniquely expressed as a linear combination of a finite subset of $B$ (uniqueness is up to the ordering of the terms).

In this case, the coordinates can't be expressed as a finite sequence. Instead, the coordinates of $v$ are given by the coordinate function $f_{v, B}: B \to F$, which maps each basis vector to its coefficient in the linear combination and is nonzero on only finitely many basis vectors.
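As an illustration of the infinite case (the example $V = F[x]$ and the names below are assumptions for this sketch, not from the text): take the basis $B = \{1, x, x^2, \ldots\}$ of the polynomials over $F$. The coordinate function of $v = 3 + 2x^5$ sends $1 \mapsto 3$, $x^5 \mapsto 2$, and every other basis vector to $0$. A dict with a default of $0$ models such a finitely-supported function $B \to F$:

```python
# Coefficients of v = 3 + 2x^5, keyed by the exponent of the basis vector x^i.
v_coords = {0: 3, 5: 2}

def f_vB(i):
    """Coordinate function of v with respect to B = {1, x, x^2, ...}:
    maps the basis vector x^i to its coefficient in v (0 if absent)."""
    return v_coords.get(i, 0)

print(f_vB(5))  # -> 2
print(f_vB(7))  # -> 0 (x^7 does not occur in v)
```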

Proof

Let $v \in V$. Since $B$ spans $V$, $v$ can be represented as a finite linear combination of vectors in $B$.

Suppose there are two such linear combinations, \[ v = \sum_{i=1}^n a_iu_i = \sum_{i=1}^n b_iu_i. \] Subtracting them gives \[ 0 = v - v = \sum_{i=1}^n (a_i-b_i)u_i. \] Since $B$ is linearly independent, $a_i - b_i = 0$ for all $i$.

Therefore, $a_i = b_i$ for all $i$, which implies that there is a unique representation of $v$ as a linear combination of $B$.

Dependency for:

  1. Canonical decomposition of a linear transformation
  2. Basis change is an isomorphic linear transformation
  3. Vector spaces are isomorphic iff their dimensions are same
  4. Basis changer

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /sets-and-relations/equivalence-relation
  4. Group
  5. Ring
  6. Polynomial
  7. Integral Domain
  8. Comparing coefficients of a polynomial with disjoint variables
  9. Field
  10. Vector Space
  11. Linear independence
  12. Span
  13. Semiring
  14. Matrix
  15. Stacking
  16. System of linear equations
  17. Product of stacked matrices
  18. Matrix multiplication is associative
  19. Reduced Row Echelon Form (RREF)
  20. Matrices over a field form a vector space
  21. Row space
  22. Elementary row operation
  23. Every elementary row operation has a unique inverse
  24. Row equivalence of matrices
  25. Row equivalent matrices have the same row space
  26. RREF is unique
  27. Identity matrix
  28. Inverse of a matrix
  29. Inverse of product
  30. Elementary row operation is matrix pre-multiplication
  31. Row equivalence matrix
  32. Equations with row equivalent matrices have the same solution set
  33. Basis of a vector space
  34. Linearly independent set is not bigger than a span
  35. Homogeneous linear equations with more variables than equations
  36. Rank of a homogeneous system of linear equations
  37. Rank of a matrix