Significant Digits
The digits in a written number that convey how precisely it is known, particularly when the number is a measurement.

Worked Example
Problem: How many significant digits does each of the following numbers have: 4500, 4500.0, 0.0032, and 10.07?
Step 1: 4500 — The digits 4 and 5 are significant. The trailing zeros have no decimal point to anchor them, so they are not considered significant.
4500 → 2 significant digits
Step 2: 4500.0 — The decimal point signals that every digit is intentional. All five digits, including the trailing zeros, are significant.
4500.0 → 5 significant digits
Step 3: 0.0032 — Leading zeros only serve as placeholders and are never significant. Only the 3 and the 2 count.
0.0032 → 2 significant digits
Step 4: 10.07 — The zero sits between nonzero digits (1 and 7), so it is significant. All four digits count.
10.07 → 4 significant digits
Answer: 4500 has 2 significant digits; 4500.0 has 5; 0.0032 has 2; and 10.07 has 4.
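The counting rules applied in the steps above can be sketched as a small Python function. This is a simplified illustration (the function name and string-based approach are choices for this sketch, not a standard library API), and it assumes the input is a plain decimal numeral without exponent notation.

```python
def count_sig_digits(numeral: str) -> int:
    """Count significant digits in a plain decimal numeral.

    Rules applied, matching the worked example:
    - Leading zeros are placeholders and never count.
    - Trailing zeros count only when a decimal point anchors them.
    - Zeros between nonzero digits always count.
    """
    digits = numeral.lstrip("+-")
    has_decimal_point = "." in digits
    digits = digits.replace(".", "")
    digits = digits.lstrip("0")            # drop leading placeholder zeros
    if not has_decimal_point:
        digits = digits.rstrip("0")        # unanchored trailing zeros don't count
    return len(digits)

print(count_sig_digits("4500"))    # 2
print(count_sig_digits("4500.0"))  # 5
print(count_sig_digits("0.0032"))  # 2
print(count_sig_digits("10.07"))   # 4
```

Note that the function takes a string, not a float: the distinction between 4500 and 4500.0 lives in how the number is written, and a float would erase it.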
Why It Matters
When you measure something in a science lab, your tool has a limited precision—a ruler marked in millimeters cannot reliably give you a length to the nearest micrometer. Significant digits communicate that limitation so that calculations based on your measurement don't imply a false level of accuracy. In any field that relies on measured data, correctly tracking significant digits prevents misleading results.
Common Mistakes
Mistake: Counting leading zeros (like the zeros in 0.0032) as significant.
Correction: Leading zeros are only placeholders that locate the decimal point. They never count as significant digits. In 0.0032, only the 3 and the 2 are significant.
Related Terms
- Precision — Describes the level of detail in a measurement
- Measurement — The context where significant digits matter most
- Rounding a Number — Used to express answers to correct sig figs
- Scientific Notation — Makes the number of significant digits unambiguous
- Accuracy — Closeness to the true value, related to precision
