Weak Law of Large Numbers — Definition, Formula & Examples
The Weak Law of Large Numbers is a theorem stating that as you take more and more observations from a random process, the sample average gets closer and closer to the true expected value. More precisely, the probability that the sample mean differs from the population mean by any given amount shrinks toward zero as the sample size grows.
Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed random variables with finite expected value $\mu$ and finite variance $\sigma^2$. The Weak Law of Large Numbers states that for every $\varepsilon > 0$, the sample mean $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ satisfies $\lim_{n \to \infty} P\left(|\bar{X}_n - \mu| \geq \varepsilon\right) = 0$. This mode of convergence is called convergence in probability.
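This convergence can be seen in a quick simulation. The sketch below (a minimal illustration, not part of the theorem itself) averages fair die rolls, whose true mean is 3.5, for increasing sample sizes; the sample means tend to settle near 3.5 as $n$ grows:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def sample_mean(n):
    """Average of n fair six-sided die rolls."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

# Larger samples tend to land closer to the true mean of 3.5.
for n in (10, 100, 1000, 10000):
    print(f"n = {n:5d}  sample mean = {sample_mean(n):.3f}")
```

Any single run can still fluctuate; the law only says large deviations become increasingly unlikely.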
Key Formula
$$P\left(|\bar{X}_n - \mu| \geq \varepsilon\right) \leq \frac{\sigma^2}{n\varepsilon^2} \longrightarrow 0 \quad \text{as } n \to \infty$$
Where:
- $\bar{X}_n$ = Sample mean of $n$ independent observations
- $\mu$ = Population mean (expected value)
- $\sigma^2$ = Population variance (assumed finite)
- $n$ = Sample size
- $\varepsilon$ = Any positive tolerance around the mean
How It Works
Pick any tolerance $\varepsilon > 0$, no matter how small. The theorem guarantees that with a large enough sample, the probability that your sample mean lands outside the interval $(\mu - \varepsilon, \mu + \varepsilon)$ becomes negligibly small. A common proof uses Chebyshev's inequality: since $\mathrm{Var}(\bar{X}_n) = \sigma^2/n$, we get $P\left(|\bar{X}_n - \mu| \geq \varepsilon\right) \leq \frac{\sigma^2}{n\varepsilon^2}$, which clearly tends to zero. The law does not say any single sample mean equals $\mu$; it describes a probabilistic trend over increasing sample sizes.
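The shrinking deviation probability can be checked numerically. Here is a minimal sketch (function and variable names are illustrative) that estimates $P(|\bar{X}_n - \mu| \geq \varepsilon)$ for fair die rolls by Monte Carlo and compares it with the Chebyshev bound $\sigma^2/(n\varepsilon^2)$:

```python
import random

random.seed(1)

MU, VAR, EPS = 3.5, 35 / 12, 0.2  # fair-die mean, variance, and tolerance

def deviation_prob(n, trials=2000):
    """Monte Carlo estimate of P(|sample mean - MU| >= EPS) over n rolls."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.randint(1, 6) for _ in range(n)) / n
        if abs(mean - MU) >= EPS:
            hits += 1
    return hits / trials

for n in (25, 100, 400):
    bound = VAR / (n * EPS**2)
    print(f"n = {n:3d}  empirical = {deviation_prob(n):.3f}  "
          f"Chebyshev bound = {min(bound, 1):.3f}")
```

Both columns shrink as $n$ grows; the Chebyshev bound is valid but loose, so the empirical frequency typically sits well below it.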
Worked Example
Problem: A fair die has mean $\mu = 3.5$ and variance $\sigma^2 = 35/12 \approx 2.917$. You roll it $n = 1000$ times and compute the sample mean. Use the Chebyshev bound to find an upper bound on the probability that the sample mean differs from 3.5 by 0.2 or more.
Identify values: We have $\mu = 3.5$, $\sigma^2 = 35/12 \approx 2.917$, $n = 1000$, and $\varepsilon = 0.2$.
Apply the bound: Substitute into the Chebyshev-based inequality from the Weak Law: $P\left(|\bar{X}_{1000} - 3.5| \geq 0.2\right) \leq \frac{35/12}{1000 \times 0.2^2} = \frac{2.917}{40} \approx 0.073$.
Answer: The probability that the sample mean is more than 0.2 away from 3.5 is at most about 7.3%. With even larger , this bound shrinks further toward zero, illustrating the Weak Law.
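The arithmetic above can be reproduced in a few lines. This quick check assumes the values stated in the problem ($\sigma^2 = 35/12$, $\varepsilon = 0.2$, and $n = 1000$ rolls, consistent with the 7.3% answer):

```python
# Chebyshev bound for the die example: P(|mean - 3.5| >= eps) <= var / (n * eps^2)
var = 35 / 12   # variance of a fair six-sided die
n = 1000        # number of rolls
eps = 0.2       # tolerance around the mean
bound = var / (n * eps**2)
print(f"Upper bound: {bound:.4f}")  # ≈ 0.0729, i.e. about 7.3%
```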
Why It Matters
The Weak Law of Large Numbers is the theoretical backbone of statistical estimation: it justifies using sample averages to estimate population parameters. Polling organizations, quality-control engineers, and actuaries all rely on this principle when they assume that larger samples yield more reliable estimates.
Common Mistakes
Mistake: Confusing the Weak Law with the Strong Law of Large Numbers.
Correction: The Weak Law guarantees convergence in probability (the chance of a big deviation vanishes). The Strong Law guarantees almost sure convergence (the sample mean converges to $\mu$ in the limit with probability 1). The Strong Law is a strictly stronger result.
