Independence of events

Dependencies:

  1. Conditional probability (incomplete)

Events $A$ and $B$ are said to be independent for the probability measure $\Pr$ iff $\Pr(A \cap B) = \Pr(A)\Pr(B)$.
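
For example (a standard illustration, not drawn from the dependencies above), let $\Pr$ be the uniform distribution on the outcomes of a fair six-sided die, $A = \{2, 4, 6\}$ and $B = \{1, 2, 3, 4\}$. Then \[ \Pr(A \cap B) = \Pr(\{2, 4\}) = \frac{1}{3} = \frac{1}{2} \cdot \frac{2}{3} = \Pr(A)\Pr(B), \] so $A$ and $B$ are independent. Replacing $B$ by $B' = \{1, 2, 3\}$ breaks independence, since $\Pr(A \cap B') = \frac{1}{6} \neq \frac{1}{4} = \Pr(A)\Pr(B')$.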

Independence is a symmetric relation, since $A \cap B = B \cap A$ and multiplication is commutative.

If $\Pr_{|C}$ is defined for an event $C$, and $A$ and $B$ are independent for $\Pr_{|C}$, then $A$ and $B$ are said to be independent conditioned on $C$.
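
Assuming $\Pr_{|C}$ is the usual conditional probability measure, i.e. $\Pr_{|C}(X) = \Pr(X \cap C)/\Pr(C)$ for $\Pr(C) > 0$, unpacking the definition shows that $A$ and $B$ are independent conditioned on $C$ iff \[ \Pr(A \cap B \mid C) = \Pr(A \mid C)\,\Pr(B \mid C). \]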

Let $S = \{A_1, A_2, \ldots\}$ be a countable set of events. The events in $S$ are said to be independent iff for every finite subset $T \subseteq S$, \[ \Pr\left(\bigcap_{A \in T} A\right) = \prod_{A \in T} \Pr(A). \]

The events in $S$ are said to be pairwise independent iff any 2 distinct events in $S$ are independent.
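
Pairwise independence is strictly weaker than independence. A classic example (not taken from the dependencies above): toss two fair coins, and let $A$ = "the first coin lands heads", $B$ = "the second coin lands heads", and $C$ = "the two coins agree". Each of the three pairs is independent, since each pairwise intersection is the single outcome $HH$ and has probability $\frac{1}{4} = \frac{1}{2} \cdot \frac{1}{2}$. But \[ \Pr(A \cap B \cap C) = \Pr(\{HH\}) = \frac{1}{4} \neq \frac{1}{8} = \Pr(A)\Pr(B)\Pr(C), \] so $A$, $B$ and $C$ are not independent.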

Dependency for:

  1. Independence of random variables (incomplete)
  2. Independence of composite events

Info:

Transitive dependencies:

  1. /sets-and-relations/countable-set
  2. /sets-and-relations/de-morgan-laws
  3. σ-algebra
  4. Measure
  5. σ-algebra is closed under countable intersections
  6. Probability
  7. Conditional probability (incomplete)