Linearity of expectation for matrices

Dependencies:

  1. Random variable
  2. Linearity of expectation

Let $X_1, X_2, \ldots, X_n$ be random matrices of the same dimensions. Then $\newcommand{\E}{\operatorname{E}}$ \[ \E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \E(X_i) \] This follows directly from linearity of expectation for real-valued random variables and the fact that the expectation of a random matrix is defined entrywise, i.e. $\E(X)[i, j] = \E(X[i, j])$.
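As a quick numerical illustration (not part of the statement itself), the following sketch estimates $\E(X_1 + X_2)$ by Monte Carlo with NumPy and compares it to $\E(X_1) + \E(X_2)$. The entry distributions, matrix shape, and sample count are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent random 2x3 matrices with known entrywise means:
# X1 has i.i.d. Uniform(0, 1) entries, so E(X1) is the all-0.5 matrix;
# X2 has i.i.d. Exponential(1) entries, so E(X2) is the all-1.0 matrix.
n_samples = 200_000
X1 = rng.uniform(0.0, 1.0, size=(n_samples, 2, 3))
X2 = rng.exponential(1.0, size=(n_samples, 2, 3))

# Monte Carlo estimate of E(X1 + X2) ...
mc_sum = (X1 + X2).mean(axis=0)

# ... versus the sum of the known expectations E(X1) + E(X2).
analytic_sum = np.full((2, 3), 0.5) + np.full((2, 3), 1.0)

print(np.max(np.abs(mc_sum - analytic_sum)))  # small; shrinks as n_samples grows
```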

Let $A$ be a constant $m$-by-$p$ matrix and $X$ be a random $p$-by-$n$ matrix. Then $\E(AX) = A\E(X)$.

Let $A$ be a constant $p$-by-$n$ matrix and $X$ be a random $m$-by-$p$ matrix. Then $\E(XA) = \E(X)A$.

The above results are collectively called 'linearity of expectation for matrices'.
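The product identities can be checked numerically in the same way. The sketch below is illustrative only: the constant matrices $A$ and $B$, the distribution of $X$, and the sample count are assumptions made for the example, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(1)

# A constant matrix A and a random matrix X with known expectation.
A = np.array([[1.0, 2.0, -1.0],
              [0.0, 3.0,  4.0]])        # 2x3
n_samples = 200_000
# X is 3x2 with i.i.d. Uniform(0, 1) entries, so E(X) is the all-0.5 matrix.
X = rng.uniform(0.0, 1.0, size=(n_samples, 3, 2))
EX = np.full((3, 2), 0.5)

# Monte Carlo estimate of E(AX) versus A E(X).
mc_AX = (A @ X).mean(axis=0)            # A @ X multiplies A with each sample of X
print(np.max(np.abs(mc_AX - A @ EX)))   # small; shrinks as n_samples grows

# Monte Carlo estimate of E(XB) versus E(X) B for a constant 2x4 matrix B.
B = rng.normal(size=(2, 4))
mc_XB = (X @ B).mean(axis=0)
print(np.max(np.abs(mc_XB - EX @ B)))   # small as well
```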

Proof

\begin{align} \E(AX)[i, j] &= \E((AX)[i, j]) = \E\left(\sum_{k=1}^p A[i, k]X[k, j]\right) \\ &= \sum_{k=1}^p A[i, k]\E(X)[k, j] \tag{by linearity of expectation} \\ &= (A\E(X))[i, j] \end{align}

\begin{align} \E(XA)[i, j] &= \E((XA)[i, j]) = \E\left(\sum_{k=1}^p X[i, k]A[k, j]\right) \\ &= \sum_{k=1}^p \E(X)[i, k]A[k, j] \tag{by linearity of expectation} \\ &= (\E(X)A)[i, j] \end{align}
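The first equality in each chain is just the definition of matrix multiplication. A tiny check of that entrywise expansion, with arbitrary illustrative matrices and indices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary matrices: A is 2x3, X is a single 3x4 sample.
A = rng.normal(size=(2, 3))
X = rng.normal(size=(3, 4))

# Entry (i, j) of AX as the explicit sum over k used in the proof.
i, j = 1, 2
entry_from_sum = sum(A[i, k] * X[k, j] for k in range(A.shape[1]))

print(np.isclose((A @ X)[i, j], entry_from_sum))  # True
```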

Dependency for:

  1. Cross-covariance matrix
  2. General multivariate normal distribution

Info:

Transitive dependencies:

  1. /analysis/topological-space
  2. /sets-and-relations/countable-set
  3. /sets-and-relations/de-morgan-laws
  4. /measure-theory/linearity-of-lebesgue-integral
  5. /measure-theory/lebesgue-integral
  6. σ-algebra
  7. Generated σ-algebra
  8. Borel algebra
  9. Measurable function
  10. Generators of the real Borel algebra (incomplete)
  11. Measure
  12. σ-algebra is closed under countable intersections
  13. Group
  14. Ring
  15. Field
  16. Vector Space
  17. Probability
  18. Random variable
  19. Expected value of a random variable
  20. Linearity of expectation