Variance of a random variable
Dependencies:
$\newcommand{\E}{\operatorname{E}}$ $\newcommand{\Var}{\operatorname{Var}}$ Let $X$ be a real random variable. Then the variance of $X$, denoted $\Var(X)$, is $\E((X-\E(X))^2)$.
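For example, if $X$ is Bernoulli with parameter $p$, i.e. $\Pr(X=1)=p$ and $\Pr(X=0)=1-p$, then $\E(X)=p$ and
\begin{align} \Var(X) &= \E((X-p)^2) \\ &= p\,(1-p)^2 + (1-p)\,p^2 \\ &= p(1-p). \end{align}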
An equivalent definition is $\Var(X) = \E(X^2) - \E(X)^2$.
Proof
\begin{align} \Var(X) &= \E((X-\E(X))^2) \\ &= \E(X^2 - 2\E(X)X + \E(X)^2) \\ &= \E(X^2) - 2\E(X)\E(X) + \E(X)^2 \tag{linearity of expectation} \\ &= \E(X^2) - \E(X)^2 \end{align}
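The equivalence can also be checked numerically. A minimal sketch in Python, using exact rational arithmetic and a fair six-sided die as the example distribution (the helper `E` is ad hoc, not a library function):

```python
from fractions import Fraction

# Fair six-sided die: uniform on {1, ..., 6}.
outcomes = range(1, 7)
p = Fraction(1, 6)

def E(f):
    """Expected value of f(X) for the die."""
    return sum(p * f(x) for x in outcomes)

mean = E(lambda x: x)                      # E(X) = 7/2
var_def = E(lambda x: (x - mean) ** 2)     # E((X - E(X))^2)
var_alt = E(lambda x: x ** 2) - mean ** 2  # E(X^2) - E(X)^2

print(var_def, var_alt)  # both equal 35/12
```

Both formulas give $\Var(X) = \tfrac{91}{6} - \left(\tfrac{7}{2}\right)^2 = \tfrac{35}{12}$, as the proof above guarantees.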
Dependency for:
- Chebyshev's inequality
- Cantelli's inequality
- Variance of sum of independent random variables
- Var(aX + b) = a^2 Var(X)
- Var(Y) = Var(E(Y|X)) + E(Var(Y|X))
- Conditional variance
- |mean - median| ≤ stddev
- Normal distribution
Info:
- Depth: 8
- Number of transitive dependencies: 20
Transitive dependencies:
- /analysis/topological-space
- /sets-and-relations/countable-set
- /sets-and-relations/de-morgan-laws
- /measure-theory/linearity-of-lebesgue-integral
- /measure-theory/lebesgue-integral
- σ-algebra
- Generated σ-algebra
- Borel algebra
- Measurable function
- Generators of the real Borel algebra (incomplete)
- Measure
- σ-algebra is closed under countable intersections
- Group
- Ring
- Field
- Vector Space
- Probability
- Random variable
- Expected value of a random variable
- Linearity of expectation