Mathwords

Weak Law of Large Numbers — Definition, Formula & Examples

The Weak Law of Large Numbers is a theorem stating that as you take more and more observations from a random process, the sample average gets closer and closer to the true expected value. More precisely, the probability that the sample mean differs from the population mean by any given amount shrinks toward zero as the sample size grows.

Let \(X_1, X_2, \ldots, X_n\) be independent and identically distributed random variables with finite expected value \(\mu\) and finite variance \(\sigma^2\). The Weak Law of Large Numbers states that for every \(\epsilon > 0\), the sample mean \(\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i\) satisfies \(\lim_{n \to \infty} P\!\left(|\bar{X}_n - \mu| \geq \epsilon\right) = 0\). This mode of convergence is called convergence in probability.

Key Formula

\[ P\!\left(\left|\bar{X}_n - \mu\right| \geq \epsilon\right) \leq \frac{\sigma^2}{n\,\epsilon^2} \]
Where:
  • \(\bar{X}_n\) = Sample mean of \(n\) independent observations
  • \(\mu\) = Population mean (expected value)
  • \(\sigma^2\) = Population variance (assumed finite)
  • \(n\) = Sample size
  • \(\epsilon\) = Any positive tolerance around the mean

How It Works

Pick any tolerance \(\epsilon > 0\), no matter how small. The theorem guarantees that with a large enough sample, the probability that your sample mean lands outside the interval \((\mu - \epsilon,\; \mu + \epsilon)\) becomes negligibly small. A common proof uses Chebyshev's inequality: since \(\text{Var}(\bar{X}_n) = \sigma^2/n\), we get \(P(|\bar{X}_n - \mu| \geq \epsilon) \leq \sigma^2/(n\epsilon^2)\), which tends to zero as \(n \to \infty\). The law does not say any single sample mean equals \(\mu\); it describes a probabilistic trend over increasing sample sizes.
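This shrinking deviation probability can be seen by simulation. A sketch using fair-die rolls (\(\mu = 3.5\)); the function name, trial count, and seed are illustrative choices, not part of the theorem:

```python
import random

def deviation_prob(n: int, eps: float = 0.2, trials: int = 2000,
                   seed: int = 0) -> float:
    """Monte Carlo estimate of P(|X̄_n − 3.5| ≥ eps) for n fair-die rolls."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.randint(1, 6) for _ in range(n)) / n
        if abs(mean - 3.5) >= eps:
            hits += 1
    return hits / trials

# The estimated probability drops sharply as n grows.
for n in (10, 100, 1000):
    print(n, deviation_prob(n))
```

With \(n = 10\) a deviation of 0.2 is common; by \(n = 1000\) it is rare, exactly the trend the Weak Law describes.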

Worked Example

Problem: A fair die has mean \(\mu = 3.5\) and variance \(\sigma^2 = 35/12 \approx 2.917\). You roll it \(n = 1{,}000\) times and compute the sample mean. Use the Chebyshev bound to find an upper bound on the probability that the sample mean differs from 3.5 by 0.2 or more.
Identify values: We have \(\mu = 3.5\), \(\sigma^2 = 35/12\), \(n = 1000\), and \(\epsilon = 0.2\).
Apply the bound: Substitute into the Chebyshev-based inequality from the Weak Law.
\[ P\!\left(|\bar{X}_{1000} - 3.5| \geq 0.2\right) \leq \frac{35/12}{1000 \times 0.04} = \frac{2.917}{40} \approx 0.0729 \]
Answer: The probability that the sample mean is 0.2 or more away from 3.5 is at most about 7.3%. With even larger \(n\), this bound shrinks further toward zero, illustrating the Weak Law.
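The worked example can be sanity-checked by simulation. A sketch, assuming a fair die and an arbitrary trial count of 2000 (in practice the true deviation probability at \(n = 1000\) is far below the Chebyshev bound, which is not tight):

```python
import random

# Repeatedly roll a fair die 1000 times and count how often the
# sample mean strays 0.2 or more from μ = 3.5.
rng = random.Random(42)
trials, n, eps = 2000, 1000, 0.2
hits = sum(
    abs(sum(rng.randint(1, 6) for _ in range(n)) / n - 3.5) >= eps
    for _ in range(trials)
)
print(hits / trials)  # empirical frequency; Chebyshev guarantees ≤ 0.0729
```

The empirical frequency comes out well under the 0.0729 bound, consistent with the inequality.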

Why It Matters

The Weak Law of Large Numbers is the theoretical backbone of statistical estimation: it justifies using sample averages to estimate population parameters. Polling organizations, quality-control engineers, and actuaries all rely on this principle when they assume that larger samples yield more reliable estimates.

Common Mistakes

Mistake: Confusing the Weak Law with the Strong Law of Large Numbers.
Correction: The Weak Law guarantees convergence in probability (the chance of a deviation of any fixed size vanishes as \(n\) grows). The Strong Law guarantees almost sure convergence (the sample mean converges to \(\mu\) with probability 1). The Strong Law is a strictly stronger result, since almost sure convergence implies convergence in probability.