Covariance of two random variables

Dependencies:

  1. Random variable
  2. Expected value of a random variable
  3. Linearity of expectation

Let $X$ and $Y$ be real random variables. Then the covariance of $X$ and $Y$ is defined to be $\newcommand{\E}{\operatorname{E}}$ \[ \operatorname{Cov}(X, Y) = \E((X - \E(X))(Y - \E(Y))) \]

An alternative definition is \[ \operatorname{Cov}(X, Y) = \E(XY) - \E(X)\E(Y) \]

These definitions are equivalent.
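For example, let $X$ take the values $0$ and $1$ with probability $\tfrac{1}{2}$ each, and let $Y = X$. Then $\E(X) = \E(Y) = \tfrac{1}{2}$, and since $X^2 = X$ we have $\E(XY) = \E(X^2) = \tfrac{1}{2}$, so \[ \operatorname{Cov}(X, Y) = \E(XY) - \E(X)\E(Y) = \tfrac{1}{2} - \tfrac{1}{4} = \tfrac{1}{4}. \] The first definition gives the same value: $\E((X - \tfrac{1}{2})(Y - \tfrac{1}{2})) = \E((X - \tfrac{1}{2})^2) = \tfrac{1}{4}$, since $(X - \tfrac{1}{2})^2 = \tfrac{1}{4}$ in both cases.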

Proof

Expanding the product and applying linearity of expectation (note that $\E(X)$ and $\E(Y)$ are constants, so they can be pulled out of the expectation):

\begin{align} & \E((X-\E(X))(Y - \E(Y))) \\ &= \E(XY - \E(X)Y - \E(Y)X + \E(X)\E(Y)) \\ &= \E(XY) - \E(X)\E(Y) - \E(Y)\E(X) + \E(X)\E(Y) \tag{linearity of expectation} \\ &= \E(XY) - \E(X)\E(Y) \end{align}
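The identity can also be checked numerically. The following is a minimal sketch (assuming NumPy is available; the sample size and the particular pair of correlated variables are arbitrary choices) that estimates both expressions from the same sample:

    import numpy as np

    rng = np.random.default_rng(0)

    # Two correlated samples standing in for X and Y (illustrative choice).
    x = rng.normal(size=100_000)
    y = 0.5 * x + rng.normal(size=100_000)

    # First definition, with E replaced by the sample mean.
    cov_centered = np.mean((x - x.mean()) * (y - y.mean()))

    # Second definition, using the same sample means.
    cov_product = np.mean(x * y) - x.mean() * y.mean()

    print(cov_centered, cov_product)

Because both estimates use the plain sample mean, they agree exactly up to floating-point rounding, and the common value is close to the true covariance of $0.5$ for this construction.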

Dependency for:

  1. Cross-covariance matrix

Info:

Transitive dependencies:

  1. /analysis/topological-space
  2. /sets-and-relations/countable-set
  3. /sets-and-relations/de-morgan-laws
  4. /measure-theory/linearity-of-lebesgue-integral
  5. /measure-theory/lebesgue-integral
  6. σ-algebra
  7. Generated σ-algebra
  8. Borel algebra
  9. Measurable function
  10. Generators of the real Borel algebra (incomplete)
  11. Measure
  12. σ-algebra is closed under countable intersections
  13. Group
  14. Ring
  15. Field
  16. Vector Space
  17. Probability
  18. Random variable
  19. Expected value of a random variable
  20. Linearity of expectation