Independence of events
Dependencies:
Events $A$ and $B$ are said to be independent for the probability measure $\Pr$ iff $\Pr(A \cap B) = \Pr(A)\Pr(B)$.
It is easy to see that independence is a symmetric relation.
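As a concrete illustration (not part of the definition above), the following Python sketch checks the product condition by exhaustive enumeration on a finite sample space with the uniform measure; the sample space, the events, and the helper `prob` are all assumptions made for this example.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair six-sided dice, all 36 outcomes equally likely.
omega = set(product(range(1, 7), repeat=2))

def prob(event):
    """Pr of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] % 2 == 0}   # first die shows an even number
B = {w for w in omega if w[1] > 4}        # second die shows more than 4

# The defining condition: Pr(A ∩ B) = Pr(A) Pr(B).
assert prob(A & B) == prob(A) * prob(B)   # 1/6 == 1/2 * 1/3
```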
If, for an event $C$, the conditional probability measure $\Pr_{|C}$ is defined, and $A$ and $B$ are independent for $\Pr_{|C}$, then $A$ and $B$ are said to be independent conditioned on $C$.
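A minimal sketch of this notion, again on an assumed finite uniform model: the conditional measure $\Pr_{|C}$ is computed here by restricting the counting to $C$, which agrees with $\Pr(\,\cdot \cap C)/\Pr(C)$ precisely because the measure is uniform; the events $A$, $B$, $C$ are chosen purely for illustration.

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))   # two fair dice, uniform measure

def prob(event, given=None):
    """Pr(event), or the conditional measure Pr_{|given}(event) when `given` is supplied."""
    if given is None:
        given = omega
    assert len(given) > 0, "Pr_{|C} is only defined when Pr(C) > 0"
    # Under the uniform measure, Pr(event ∩ given) / Pr(given) reduces to counting.
    return Fraction(len(event & given), len(given))

A = {w for w in omega if w[0] % 2 == 0}   # first die even
B = {w for w in omega if w[1] > 4}        # second die shows more than 4
C = {w for w in omega if w[0] <= 3}       # first die shows at most 3

# A and B are independent conditioned on C: the product condition holds for Pr_{|C}.
assert prob(A & B, given=C) == prob(A, given=C) * prob(B, given=C)   # 1/9 == 1/3 * 1/3
```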
Let $S = \{A_1, A_2, \ldots\}$ be a countable set of events. The events in $S$ are said to be independent iff for every subset $T \subseteq S$,
\[ \Pr\left(\bigcap_{A \in T} A\right) = \prod_{A \in T} \Pr(A). \]
By continuity of the probability measure, it suffices to check this condition for finite subsets $T$.
The events in $S$ are said to be pairwise independent iff any two distinct events in $S$ are independent.
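Pairwise independence is strictly weaker than independence. A standard illustration (not drawn from this page) uses two fair coin flips together with the event that the two flips agree; the sketch below checks by enumeration that these three events are pairwise independent, yet the product condition fails for the full triple.

```python
from fractions import Fraction
from itertools import combinations, product

# Sample space: two fair coin flips, all 4 outcomes equally likely.
omega = set(product("HT", repeat=2))

def prob(event):
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == "H"}    # first flip is heads
B = {w for w in omega if w[1] == "H"}    # second flip is heads
C = {w for w in omega if w[0] == w[1]}   # the two flips agree

# Pairwise independent: every pair of distinct events satisfies the product condition.
for X, Y in combinations([A, B, C], 2):
    assert prob(X & Y) == prob(X) * prob(Y)            # 1/4 == 1/2 * 1/2

# Not independent: the condition fails for T = {A, B, C}.
assert prob(A & B & C) != prob(A) * prob(B) * prob(C)  # 1/4 != 1/8
```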
Dependency for:
Info:
- Depth: 5
- Number of transitive dependencies: 7
Transitive dependencies:
- /sets-and-relations/countable-set
- /sets-and-relations/de-morgan-laws
- σ-algebra
- Measure
- σ-algebra is closed under countable intersections
- Probability
- Conditional probability (incomplete)