Expectation Value — Definition, Formula & Examples
Expectation value is the long-run average outcome of a random variable, calculated by weighting each possible value by its probability. It tells you what result to 'expect' on average over many repetitions of an experiment.
For a discrete random variable X with possible values x_1, x_2, … and corresponding probabilities p_1, p_2, …, the expectation value (or expected value) is defined as E[X] = Σ_i x_i p_i, provided the sum converges absolutely.
Key Formula
E[X] = Σ_{i=1}^{n} x_i · P(x_i)
Where:
- E[X] = Expected value of the random variable X
- x_i = The i-th possible outcome of X
- P(x_i) = Probability that X equals x_i
- n = Number of possible outcomes
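The formula above can be sketched as a short Python function (the name expected_value and the tolerance check are illustrative choices, not part of any standard library):

```python
def expected_value(outcomes, probabilities):
    """Compute E[X] = sum of x_i * P(x_i) over all outcomes."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(x * p for x, p in zip(outcomes, probabilities))
```

For instance, expected_value([0, 10, 50], [0.5, 0.3, 0.2]) returns the $13 payout computed in the worked example below.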
How It Works
To compute E[X], multiply each outcome x_i by its probability P(x_i), then add all the products together. The result is not necessarily a value that X can actually take: for example, the expected value of a fair die roll is 3.5, which is not a face on the die. Think of it as the balance point of the probability distribution. When the same experiment is repeated many times, the sample mean converges to E[X] by the law of large numbers.
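Both claims in the paragraph above, the 3.5 expected value and the convergence of the sample mean, can be checked with a small simulation (the seed and roll count are arbitrary choices for reproducibility):

```python
import random

faces = [1, 2, 3, 4, 5, 6]

# Exact expectation: each face is equally likely with probability 1/6.
exact = sum(x * (1 / 6) for x in faces)  # 3.5, not an actual face

# Law of large numbers: the average of many simulated rolls
# approaches the expected value.
random.seed(0)
n = 100_000
sample_mean = sum(random.choice(faces) for _ in range(n)) / n
print(exact, sample_mean)  # sample_mean lands close to 3.5
```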
Worked Example
Problem: A game pays $0 with probability 0.5, $10 with probability 0.3, and $50 with probability 0.2. What is the expected payout?
Step 1: Multiply each payout by its probability: 0 × 0.5 = 0, 10 × 0.3 = 3, and 50 × 0.2 = 10.
Step 2: Sum all the products: 0 + 3 + 10 = 13.
Answer: The expected payout is $13 per play of the game.
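The two steps above translate directly into code (a minimal sketch of this specific example, not a general-purpose routine):

```python
payouts = [0, 10, 50]
probs = [0.5, 0.3, 0.2]

# Step 1: multiply each payout by its probability.
products = [x * p for x, p in zip(payouts, probs)]

# Step 2: sum the products to get the expected payout.
expected = sum(products)
print(expected)  # 13.0
```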
Why It Matters
Expected value is central to decision-making under uncertainty. Insurance companies use it to set premiums, investors use it to compare portfolios, and it appears throughout courses in probability, statistics, quantum mechanics, and game theory.
Common Mistakes
Mistake: Averaging the outcomes without weighting by probability (e.g., computing (0 + 10 + 50)/3 = 20 instead of 13).
Correction: Each outcome must be multiplied by its own probability before summing. Equal weighting is only correct when all outcomes are equally likely.
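The mistake and its correction are easy to see side by side in code (reusing the game from the worked example):

```python
payouts = [0, 10, 50]
probs = [0.5, 0.3, 0.2]

# Mistake: unweighted average treats every outcome as equally likely.
unweighted = sum(payouts) / len(payouts)  # 20.0

# Correction: weight each outcome by its own probability.
weighted = sum(x * p for x, p in zip(payouts, probs))  # 13.0

print(unweighted, weighted)
```

The two answers agree only when every probability equals 1/n, i.e. when the outcomes are equally likely.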
