Symmetric operator iff hermitian

Dependencies:

  1. Conjugate Transpose and Hermitian
  2. Basis of a vector space
  3. Gram-Schmidt Process
  4. Symmetric operator
  5. Canonical decomposition of a linear transformation
  6. Matrix of linear transformation
  7. Matrices form an inner-product space
  8. Inner product is anti-linear in second argument
  9. Transpose of product

Let $B = [v_1, v_2, \ldots, v_n]$ be an orthonormal basis of $V$, where $V$ is an inner product space of finite dimension $n$ over the field $F$.

Let $L: V \to V$ be a linear transformation. By the canonical decomposition theorem, $L = RTR^{-1}$, where $R: F^n \to V$ is the isomorphic linear transformation defined by $R([a_1, a_2, \ldots, a_n]) = \sum_{i=1}^n a_iv_i$, and $T = R^{-1}LR: F^n \to F^n$.

Since every linear transformation from $F^n$ to $F^n$ can be expressed as matrix pre-multiplication, there is an $n \times n$ matrix $A$ over $F$ such that $T(x) = Ax$.

Then $L$ is symmetric iff $A = A^*$ ($A^*$ is the conjugate transpose of $A$).

Conversely, let $A$ be a square matrix. Then $T(u) = Au$ is a symmetric linear operator iff $A = A^*$.

Proof

Define $\langle x, y \rangle = y^*x$ to be the inner product on $F^n$.
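As a quick numerical sanity check (an illustration, not part of the proof), this inner product is a conjugated dot product; in numpy it agrees with `np.vdot`, which conjugates its first argument:

```python
import numpy as np

# <x, y> = y^* x: conjugate y, then take the ordinary dot product with x.
x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 1j])

inner = np.conj(y) @ x                    # y^* x
assert np.isclose(inner, np.vdot(y, x))   # np.vdot conjugates its first argument
```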

Lemma 1: $\langle R(a), R(b) \rangle = \langle a, b \rangle$

Let $a = [a_1, a_2, \ldots, a_n]$ and $b = [b_1, b_2, \ldots, b_n]$.

\begin{align} \langle R(a), R(b) \rangle &= \left\langle \sum_{i=1}^n a_iv_i, \sum_{i=1}^n b_iv_i \right\rangle \\ &= \sum_{i=1}^n \sum_{j=1}^n a_i \overline{b_j} \langle v_i, v_j \rangle \tag{by (anti-)linearity} \\ &= \sum_{i=1}^n a_i \overline{b_i} \tag{$B$ is orthonormal} \\ &= b^*a = \langle a, b \rangle \end{align}
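Lemma 1 can be illustrated numerically (a spot check, not a proof): take the orthonormal basis vectors $v_1, \ldots, v_n$ to be the columns of a unitary matrix $Q$, so that $R(a) = Qa$; then $\langle R(a), R(b) \rangle = (Qb)^*(Qa) = b^*a = \langle a, b \rangle$:

```python
import numpy as np

# Columns of Q (from a QR decomposition) form an orthonormal basis of C^n,
# so R(a) = Q a is the coordinate map R for this basis.
rng = np.random.default_rng(0)
n = 4
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Q, _ = np.linalg.qr(M)              # Q has orthonormal columns: Q^* Q = I

a = rng.normal(size=n) + 1j * rng.normal(size=n)
b = rng.normal(size=n) + 1j * rng.normal(size=n)

lhs = np.conj(Q @ b) @ (Q @ a)      # <R(a), R(b)> = (Qb)^* (Qa)
rhs = np.conj(b) @ a                # <a, b> = b^* a
assert np.isclose(lhs, rhs)
```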

Lemma 2

Let $R(x) = u$ and $R(y) = v$.

\begin{align} & \langle L(u), v \rangle \\ &= \langle R(T(R^{-1}(u))), v \rangle \\ &= \langle R(T(x)), R(y) \rangle \\ &= \langle T(x), y \rangle \tag{by lemma 1} \\ &= \langle Ax, y \rangle \\ &= y^*Ax \end{align}

\begin{align} & \langle u, L(v) \rangle \\ &= \langle u, R(T(R^{-1}(v))) \rangle \\ &= \langle R(x), R(T(y)) \rangle \\ &= \langle x, T(y) \rangle \tag{by lemma 1} \\ &= \langle x, Ay \rangle \\ &= (Ay)^*x = y^*A^*x \end{align}
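The last step above, $(Ay)^*x = y^*A^*x$, relies on the transpose-of-product dependency; a small numerical check of that identity:

```python
import numpy as np

# Check (Ay)^* x = y^* A^* x for a random complex matrix A and vectors x, y.
rng = np.random.default_rng(3)
n = 3
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
x = rng.normal(size=n) + 1j * rng.normal(size=n)
y = rng.normal(size=n) + 1j * rng.normal(size=n)

lhs = np.conj(A @ y) @ x            # (Ay)^* x
rhs = np.conj(y) @ A.conj().T @ x   # y^* A^* x  (A^* is the conjugate transpose)
assert np.isclose(lhs, rhs)
```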

Lemma 3

Suppose $\forall x, y \in F^n, y^*Ax = 0$.

\[ y^*Ax = \sum_{i=1}^n \sum_{j=1}^n (y^*)[1, i] A[i, j] x[j, 1] = \sum_{i=1}^n \sum_{j=1}^n \overline{y_i} x_j A[i, j] \]

Plugging in $x = e_j$ and $y = e_i$ in the above equation ($e_k$ is a column vector with all entries 0 except the $k^{\textrm{th}}$ entry, which is 1), we get $y^*Ax = A[i, j]$.

Therefore, $(\forall x, y \in F^n, y^*Ax = 0) \iff A = 0$ (the reverse implication holds trivially, since $y^*0x = 0$).
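The key computation in Lemma 3, that $e_i^*Ae_j$ picks out the entry $A[i, j]$, can be checked numerically:

```python
import numpy as np

# e_i^* A e_j extracts the (i, j) entry of A.
rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
for i in range(3):
    for j in range(3):
        e_i = np.eye(3)[:, i]       # standard basis column vectors
        e_j = np.eye(3)[:, j]
        assert np.isclose(np.conj(e_i) @ A @ e_j, A[i, j])
```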

Conclusion

\begin{align} & \forall u, v \in V, \langle L(u), v \rangle = \langle u, L(v) \rangle \\ &\iff \forall x, y \in F^n, y^*Ax = y^*A^*x \tag{by lemma 2 and $\because R$ is a bijection} \\ &\iff \forall x, y \in F^n, y^*(A^*-A)x = 0 \\ &\iff A = A^* \tag{by lemma 3} \end{align}

Converse

Let $A$ be an $n$ by $n$ matrix. Then $T(u) = Au$ is a linear transformation.

\[ \langle T(u), v \rangle - \langle u, T(v) \rangle = \langle Au, v \rangle - \langle u, Av \rangle = v^*(Au) - (Av)^*u = v^*Au - v^*A^*u = v^*(A - A^*)u \]

\begin{align} & T \textrm{ is symmetric} \\ &\iff \forall u, v \in F^n, \langle T(u), v \rangle = \langle u, T(v) \rangle \\ &\iff \forall u, v \in F^n, v^*(A - A^*)u = 0 \\ &\iff A = A^* \tag{by lemma 3} \end{align}
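The converse direction admits a numerical spot check (not a proof): a hermitian $A$ satisfies $\langle Au, v \rangle = \langle u, Av \rangle$ for random $u, v$, with $\langle x, y \rangle = y^*x$ as defined above:

```python
import numpy as np

# For hermitian A, the operator u -> Au is symmetric: <Au, v> = <u, Av>.
rng = np.random.default_rng(2)
n = 4
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = M + M.conj().T                  # hermitian by construction: A = A^*

u = rng.normal(size=n) + 1j * rng.normal(size=n)
v = rng.normal(size=n) + 1j * rng.normal(size=n)

assert np.allclose(A, A.conj().T)                            # A = A^*
assert np.isclose(np.conj(v) @ (A @ u), np.conj(A @ v) @ u)  # <Au, v> = <u, Av>
```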

Dependency for:

  1. Orthogonally diagonalizable iff hermitian

Info:

Transitive dependencies:

  1. /linear-algebra/vector-spaces/condition-for-subspace
  2. /linear-algebra/matrices/gauss-jordan-algo
  3. /complex-numbers/conjugate-product-abs
  4. /complex-numbers/conjugation-is-homomorphic
  5. /sets-and-relations/composition-of-bijections-is-a-bijection
  6. /sets-and-relations/equivalence-relation
  7. Group
  8. Ring
  9. Polynomial
  10. Vector
  11. Dot-product of vectors
  12. Integral Domain
  13. Comparing coefficients of a polynomial with disjoint variables
  14. Field
  15. Vector Space
  16. Linear independence
  17. Span
  18. Linear transformation
  19. Composition of linear transformations
  20. Vector space isomorphism is an equivalence relation
  21. Inner product space
  22. Inner product is anti-linear in second argument
  23. Orthogonality and orthonormality
  24. Gram-Schmidt Process
  25. Symmetric operator
  26. Semiring
  27. Matrix
  28. Stacking
  29. System of linear equations
  30. Product of stacked matrices
  31. Matrix multiplication is associative
  32. Reduced Row Echelon Form (RREF)
  33. Conjugate Transpose and Hermitian
  34. Transpose of product
  35. Trace of a matrix
  36. Matrices over a field form a vector space
  37. Row space
  38. Matrices form an inner-product space
  39. Elementary row operation
  40. Every elementary row operation has a unique inverse
  41. Row equivalence of matrices
  42. Row equivalent matrices have the same row space
  43. RREF is unique
  44. Identity matrix
  45. Inverse of a matrix
  46. Inverse of product
  47. Elementary row operation is matrix pre-multiplication
  48. Row equivalence matrix
  49. Equations with row equivalent matrices have the same solution set
  50. Basis of a vector space
  51. Linearly independent set is not bigger than a span
  52. Homogeneous linear equations with more variables than equations
  53. Rank of a homogenous system of linear equations
  54. Rank of a matrix
  55. Basis of F^n
  56. Matrix of linear transformation
  57. Coordinatization over a basis
  58. Basis changer
  59. Basis change is an isomorphic linear transformation
  60. Vector spaces are isomorphic iff their dimensions are same
  61. Canonical decomposition of a linear transformation