Tolerance — Definition, Formula & Examples
Tolerance is the amount a measurement is allowed to differ from a target value and still be considered acceptable. It defines how much error or variation is permitted.
Tolerance is the maximum permissible deviation from a specified nominal value, expressed as a range within which a measured quantity must fall to meet a given standard. If the nominal value is N and the tolerance is T, then any measurement in the interval [N - T, N + T] is acceptable.
Key Formula
Acceptable range = [N - T, N + T], often written N ± T
Where:
- N = Nominal (target) value
- T = Tolerance, the maximum allowed deviation from the target
How It Works
You start with a target (nominal) value and a stated tolerance. Subtract the tolerance from the target to get the lower bound, and add it to get the upper bound. Any measurement that falls within this interval passes; anything outside it does not. Tolerance is often written with a plus-or-minus symbol, such as 12 ± 0.3 cm.
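The steps above can be sketched as a small check in Python (the function name and values are illustrative, not part of any standard):

```python
def within_tolerance(measurement, nominal, tolerance):
    """Return True if measurement falls in [nominal - tolerance, nominal + tolerance]."""
    lower = nominal - tolerance  # lower bound of the acceptable range
    upper = nominal + tolerance  # upper bound of the acceptable range
    return lower <= measurement <= upper

print(within_tolerance(12.2, 12.0, 0.3))  # True: 12.2 lies between 11.7 and 12.3
```

The same three steps apply by hand: compute the lower bound, compute the upper bound, then compare.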
Worked Example
Problem: A bolt must be 12 mm long with a tolerance of ±0.3 mm. A bolt measures 12.2 mm. Does it pass?
Find lower bound: Subtract the tolerance from the target value: 12 - 0.3 = 11.7 mm.
Find upper bound: Add the tolerance to the target value: 12 + 0.3 = 12.3 mm.
Check the measurement: 12.2 mm falls between 11.7 mm and 12.3 mm, so it is within the acceptable range.
Answer: Yes, 12.2 mm is within the acceptable range of 11.7 mm to 12.3 mm, so the bolt passes.
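The worked example can be reproduced directly, here as a short Python sketch (variable names are chosen for clarity, not taken from any source):

```python
nominal = 12.0    # target bolt length in mm
tolerance = 0.3   # allowed deviation in mm
measured = 12.2   # actual bolt length in mm

lower = nominal - tolerance   # 11.7 mm
upper = nominal + tolerance   # 12.3 mm
passes = lower <= measured <= upper
print(f"Range: {lower} mm to {upper} mm; passes: {passes}")
# Range: 11.7 mm to 12.3 mm; passes: True
```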
Why It Matters
Tolerance shows up whenever real-world measurements matter — in manufacturing, engineering, and science labs. In middle-school math and science classes, understanding tolerance helps you judge whether a measurement result is close enough to be acceptable, which builds skills you use later in statistics and quality control.
Common Mistakes
Mistake: Treating the tolerance value as the full range instead of half the range.
Correction: A tolerance of ±0.3 means the total acceptable range is 0.6, not 0.3. Always apply the tolerance in both directions from the target.
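A quick arithmetic check makes the correction concrete: the full acceptable range is always twice the stated tolerance, since it extends in both directions from the target.

```python
tolerance = 0.3
full_range = 2 * tolerance  # range extends tolerance below AND above the target
print(full_range)  # 0.6, not 0.3
```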
