Random variable
Dependencies:
- Probability
- Measurable function
- Measure
- Generated σ-algebra
- Borel algebra
- Generators of the real Borel algebra (incomplete)
$\newcommand{\Fcal}{\mathcal{F}}$ $\newcommand{\Ecal}{\mathcal{E}}$ $\newcommand{\Tcal}{\mathcal{T}}$ Let $(\Omega, \Fcal, \Pr)$ be a probability space and let $\Ecal$ be a $\sigma$-algebra over a set $D$. Then any function $X: \Omega \mapsto D$ that is measurable with respect to $\Fcal$ and $\Ecal$ is said to be a random variable with support $D$.
For $S \in \Ecal$, define $X^{-1}(S) = \{\omega \in \Omega: X(\omega) \in S\}$ and define $\Pr(X \in S) = \Pr(X^{-1}(S))$; this is well-defined since measurability of $X$ guarantees $X^{-1}(S) \in \Fcal$. Define $\Pr_X: \Ecal \mapsto [0, 1]$ as $\Pr_X(S) = \Pr(X \in S)$. $\Pr_X$ is called the probability distribution of $X$.
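The pushforward construction of $\Pr_X$ can be made concrete on a finite space. Below is a minimal Python sketch, assuming a hypothetical fair-die example where $X$ is the parity of the outcome; the names (`pr`, `pr_X`, `preimage`) are illustrative only.

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die.
# Omega = {1, ..., 6}, Fcal = power set of Omega, Pr = uniform measure.
Omega = {1, 2, 3, 4, 5, 6}
weights = {omega: Fraction(1, 6) for omega in Omega}

def pr(event):
    """Pr(event) for an event, i.e. a subset of Omega."""
    return sum(weights[omega] for omega in event)

# X maps each outcome to its parity, so D = {0, 1} and Ecal = power set of D.
def X(omega):
    return omega % 2

def preimage(S):
    """X^{-1}(S) = {omega in Omega : X(omega) in S}."""
    return {omega for omega in Omega if X(omega) in S}

def pr_X(S):
    """Pr_X(S) = Pr(X^{-1}(S)), the distribution of X."""
    return pr(preimage(S))

print(pr_X({1}))     # 1/2, since X^{-1}({1}) = {1, 3, 5}
print(pr_X({0, 1}))  # 1, since X^{-1}(D) = Omega
```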
Probability space of probability distribution
Theorem: $(D, \Ecal, \Pr_X)$ is a probability space.
Proof:
- Non-negativity: $\Pr_X(S) = \Pr(X^{-1}(S)) \ge 0$ for every $S \in \Ecal$, since $\Pr$ is a measure.
- $X^{-1}(D) = \Omega$, so $\Pr_X(D) = \Pr(\Omega) = 1$.
- $\Pr_X(\{\}) = \Pr(\{\omega \in \Omega: X(\omega) \in \{\}\}) = \Pr(\{\}) = 0$.
- $\sigma$-additivity: Let $\Tcal$ be a countable set of pairwise-disjoint sets from $\Ecal$. Since $X$ is a function, preimages of disjoint sets are disjoint, so $\Tcal' = \{X^{-1}(A): A \in \Tcal\}$ is also a countable set of pairwise-disjoint sets. Since $X$ is measurable, $\Tcal' \subseteq \Fcal$. \begin{align} \Pr_X\left(\bigcup_{A \in \Tcal} A\right) &= \Pr\left(X^{-1}\left(\bigcup_{A \in \Tcal} A\right)\right) \\ &= \Pr\left(\bigcup_{A \in \Tcal} X^{-1}(A)\right) \\ &= \sum_{A \in \Tcal} \Pr(X^{-1}(A)) \\ &= \sum_{A \in \Tcal} \Pr_X(A) \tag*{$\square$} \end{align} Therefore, $\Pr_X$ is a probability measure over $(D, \Ecal)$.
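As a concrete instance of the $\sigma$-additivity step, take the fair-die sketch above ($X(\omega) = \omega \bmod 2$, $D = \{0, 1\}$) with $\Tcal = \{\{0\}, \{1\}\}$:
\[ \Pr_X(\{0, 1\}) = \Pr(X^{-1}(\{0\}) \cup X^{-1}(\{1\})) = \Pr(\{2, 4, 6\}) + \Pr(\{1, 3, 5\}) = \Pr_X(\{0\}) + \Pr_X(\{1\}) = 1. \]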
Totally-ordered random variables
If $D$ is totally-ordered and $\{y \in D: y \le x\} \in \Ecal$ for every $x \in D$, then the cumulative distribution function (CDF) $F_X: D \mapsto [0, 1]$ of a random variable $X$ is defined as $F_X(x) = \Pr(X \le x) = \Pr(\{\omega \in \Omega: X(\omega) \le x\})$. If $x \le y$, then $\{\omega: X(\omega) \le x\} \subseteq \{\omega: X(\omega) \le y\}$, so $F_X$ is a non-decreasing function. Also, since $\{\omega: X(\omega) > x\}$ is the complement of $\{\omega: X(\omega) \le x\}$, we get $\Pr(X > x) = 1 - F_X(x)$.
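As a concrete example, take $X$ to be the value shown by a fair die, so $D = \{1, 2, \ldots, 6\}$ with the usual order, $\Ecal = 2^D$, and $\Pr_X$ uniform. Then
\[ F_X(x) = \Pr(X \le x) = \frac{|\{k \in D : k \le x\}|}{6}, \qquad \text{e.g. } F_X(4) = \frac{2}{3} \text{ and } \Pr(X > 4) = 1 - F_X(4) = \frac{1}{3}. \]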
When $D$ is totally ordered, for a sequence $X = [X_1, X_2, \ldots, X_n]$ of random variables, the joint CDF of $X$ is defined as \[ F_X(x_1, x_2, \ldots, x_n) = \Pr(X_1 \le x_1 \cap X_2 \le x_2 \cap \ldots \cap X_n \le x_n) \]
When $\Ecal = \sigma(\{\{y \in D: y \le x\}: x \in D\})$, $F_X$ completely characterizes $\Pr_X$.
Discrete random variables
When $D$ is countable, $X$ is called a discrete random variable.
The probability mass function $f_X: D \mapsto [0, 1]$ of a discrete random variable $X$ is defined as $f_X(x) = \Pr(X = x) = \Pr(\{\omega \in \Omega: X(\omega) = x\})$. The probability mass function is sometimes also called the distribution function.
For a sequence $X = [X_1, X_2, \ldots, X_n]$ of random variables, the joint probability mass function of $X$ is defined as \[ f_X(x_1, x_2, \ldots, x_n) = \Pr(X_1 = x_1 \cap X_2 = x_2 \cap \ldots \cap X_n = x_n) \]
When $\Ecal$ is the power set of $D$, $f_X$ completely characterizes $\Pr_X$: since $D$ is countable, every $S \subseteq D$ is a countable union of singletons, so by $\sigma$-additivity $\Pr_X(S) = \sum_{x \in S} f_X(x)$.
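For example, a Bernoulli random variable with parameter $p \in [0, 1]$ has $D = \{0, 1\}$ and $\Ecal = 2^D$, with
\[ f_X(1) = p, \qquad f_X(0) = 1 - p, \qquad \Pr_X(\{0, 1\}) = f_X(0) + f_X(1) = 1. \]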
Continuous random variables
Let $X$ be a random variable with support $\mathbb{R}$.
Suppose there exists a function $f_X: \mathbb{R} \mapsto \mathbb{R}_{\ge 0}$ such that \[ F_X(x) = \int_{-\infty}^x f_X(t)\, dt \] Then $f_X$ is called the probability density function (PDF) of $X$ and $X$ is said to be a continuous random variable.
$f_X(x)$ is sometimes written informally as $\Pr(x \le X \le x+\mathrm{d}x)/\mathrm{d}x$, i.e. $f_X(x)\,\mathrm{d}x$ is interpreted as the probability that $X$ lies in an infinitesimal interval of length $\mathrm{d}x$ starting at $x$.
Since the $\sigma$-algebra generated by sets of the form $(-\infty, a]$ is $\mathcal{B}(\mathbb{R})$, $\Ecal$ is often chosen to be $\mathcal{B}(\mathbb{R})$ so that $F_X$ completely characterizes $\Pr_X$. If $X$ is a continuous random variable, this would mean that $f_X$ completely characterizes $\Pr_X$.
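For example, the exponential density with rate $\lambda > 0$,
\[ f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases} \]
gives $F_X(x) = \int_{-\infty}^{x} f_X(t)\, dt = 1 - e^{-\lambda x}$ for $x \ge 0$ and $F_X(x) = 0$ for $x < 0$, so $f_X$ determines $F_X$ and hence $\Pr_X$ on $\mathcal{B}(\mathbb{R})$.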
Let $X = [X_1, X_2, \ldots, X_n]$ be a sequence of random variables. Suppose there exists a function $f_X: \mathbb{R}^n \mapsto \mathbb{R}_{\ge 0}$ such that \[ F_X(x_1, x_2, \ldots, x_n) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} \ldots \int_{-\infty}^{x_n} f_X(t_1, t_2, \ldots, t_n)\, dt_n \ldots dt_2\, dt_1 \] Then $f_X$ is called the (joint) probability density function (PDF) of $X$ and $X$ is said to be a multivariate continuous random variable.
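For instance, for the uniform density on the unit square, $f_X(t_1, t_2) = 1$ if $(t_1, t_2) \in [0, 1]^2$ and $0$ otherwise,
\[ F_X(x_1, x_2) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} f_X(t_1, t_2)\, dt_2\, dt_1 = x_1 x_2 \qquad \text{for } (x_1, x_2) \in [0, 1]^2. \]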
Dependency for:
- Markov's bound
- Cantelli's inequality
- Chernoff bound
- Poisson distribution
- Bernoulli random variable
- Linearity of expectation
- Counting process
- X ≤ Y ⟹ E(X) ≤ E(Y)
- Independence of random variables (incomplete)
- Cauchy-Schwarz inequality for random variables
- Law of total probability: decomposing expectation over countable events
- Law of total probability: P(A) = E(P(A|X)) (incomplete)
- Conditional expectation
- Linearity of expectation for matrices
- Distribution of sum of random variables (incomplete)
- Law of total probability: E(Y) = E(E(Y|X)) (incomplete)
- Expectation of product of independent random variables (incomplete)
- Conditioning over random variable
- Probability: limit of CDF
- Expected value of a random variable
- Var(Y) = Var(E(Y|X)) + E(Var(Y|X))
- Covariance of 2 random variables
- Cross-covariance matrix
- Conditional variance
- Variance of a random variable
- Covariance matrix
- Median of a random variable
- Standard multivariate normal distribution
- Normal distribution
- Markov chain
Info:
- Depth: 5
- Number of transitive dependencies: 11
Transitive dependencies:
- /analysis/topological-space
- /sets-and-relations/countable-set
- /sets-and-relations/de-morgan-laws
- σ-algebra
- Generated σ-algebra
- Borel algebra
- Measurable function
- Generators of the real Borel algebra (incomplete)
- Measure
- σ-algebra is closed under countable intersections
- Probability