The Notebook
İstanbul, Turkey

Weak Law of Large Numbers

This post explains the Weak Law of Large Numbers. To do so, we first introduce Chebyshev's inequality, then state and prove the weak law of large numbers.

Chebyshev's Inequality:

Let X be a random variable with mean E[X] = \mu and variance \mathrm{Var}[X] = \sigma^2.

Then Chebyshev's inequality states that

P(|X - \mu| \geq t) \leq \frac{\sigma^2}{t^2}, for t > 0.

Let A denote the event |X - \mu| \geq t, and A^{c} the complementary event |X - \mu| < t. By the law of total expectation,

\sigma^2 = E[(X - \mu)^2|A]P(A) + E[(X - \mu)^2|A^{c}]P(A^{c})

\sigma^2 \geq E[(X - \mu)^2|A]P(A)

since 0 \leq P(A^{c}) \leq 1 and E[(X - \mu)^2|A^{c}] \geq 0.

Whenever A occurs, |X - \mu| \geq t, which implies (X - \mu)^{2} \geq t^{2}. Hence E[(X - \mu)^2 \mid A] \geq t^{2}.

Finally, combining the last two inequalities, we get

\sigma^2 \geq t^2 P(A), which is equivalent to P(|X - \mu| \geq t) \leq \frac{\sigma^2}{t^2}.
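As a sanity check, here is a minimal Python sketch that compares the empirical tail probability against the Chebyshev bound; the exponential distribution, sample size, and thresholds are arbitrary illustrative choices, not part of the proof.

```python
import random
import statistics

# Empirical check of Chebyshev's inequality.
# Exponential distribution with rate 1: mu = 1, sigma^2 = 1
# (illustrative choice; any distribution with finite variance works).
random.seed(0)
n_samples = 100_000
samples = [random.expovariate(1.0) for _ in range(n_samples)]
mu = 1.0
sigma2 = 1.0

for t in (1.0, 2.0, 3.0):
    # Empirical estimate of P(|X - mu| >= t)
    tail = sum(1 for x in samples if abs(x - mu) >= t) / n_samples
    bound = sigma2 / t**2
    print(f"t={t}: P(|X - mu| >= t) ~ {tail:.4f}  <=  bound {bound:.4f}")
```

The bound is typically loose (for the exponential at t = 1 the true tail is about 0.135 against a bound of 1), which is expected: Chebyshev's inequality uses only the variance, not the full distribution.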

Weak Law of Large Numbers:

Let X_1, X_2, \dots, X_n be an independent trials process with finite expected value \mu = E[X_j] and finite variance \sigma^{2} = \mathrm{Var}[X_j].
Let S_n = X_1 + X_2 + X_3 + \dots + X_n.

Then, for any t > 0, P(|\frac{S_n}{n} - \mu| \geq t) \rightarrow 0 as n \rightarrow \infty.

Since X_1, X_2, \dots, X_n are independent and identically distributed, we have

\mathrm{Var}[S_n] = n \sigma^2

\mathrm{Var}\left[\frac{S_n}{n}\right] = \frac{\sigma^2}{n}
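The variance scaling \mathrm{Var}[S_n/n] = \sigma^2/n can be checked by simulation; the sketch below uses Uniform(0, 1) trials (so \sigma^2 = 1/12), with n and the number of repetitions chosen arbitrarily for illustration.

```python
import random
import statistics

# Estimate Var[S_n / n] by simulation and compare with sigma^2 / n.
random.seed(1)
n = 50           # number of trials per sample mean
reps = 20_000    # number of simulated sample means
sigma2 = 1.0 / 12.0  # variance of Uniform(0, 1)

means = [statistics.mean(random.random() for _ in range(n)) for _ in range(reps)]
est = statistics.variance(means)
print(f"empirical Var[S_n/n] = {est:.5f}, sigma^2/n = {sigma2 / n:.5f}")
```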

Since E[\frac{S_n}{n}] = \mu, applying Chebyshev's inequality to \frac{S_n}{n} gives, for any t > 0,

P(|\frac{S_n}{n} - \mu| \geq t) \leq \frac{\sigma^{2}}{nt^{2}}

Thus, for any fixed t > 0,

P(|\frac{S_n}{n} - \mu| \geq t) \rightarrow 0

as n \rightarrow \infty, or equivalently

P(|\frac{S_n}{n} - \mu| < t) \rightarrow 1

as n \rightarrow \infty.
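The theorem can be seen in action with a short simulation: for fair-coin flips (\mu = 0.5), the deviation probability P(|\frac{S_n}{n} - \mu| \geq t) shrinks as n grows. The values of t, n, and the repetition count below are illustrative choices.

```python
import random

# Estimate P(|S_n/n - mu| >= t) for increasing n (fair coin, mu = 0.5).
random.seed(2)
t = 0.05
reps = 5_000
mu = 0.5

probs = []
for n in (10, 100, 1000):
    deviations = 0
    for _ in range(reps):
        # Sample mean of n Bernoulli(0.5) trials
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - mu) >= t:
            deviations += 1
    probs.append(deviations / reps)
    print(f"n={n}: P(|S_n/n - mu| >= {t}) ~ {probs[-1]:.3f}")
```

Consistent with the Chebyshev bound \sigma^2 / (n t^2), the estimated probability drops steadily as n increases by factors of 10.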