Norm — Definition, Formula & Examples
A norm is a function that assigns a non-negative length or size to a vector in a vector space. The most common norm is the Euclidean norm, which gives the straight-line distance from the origin to the tip of the vector.
A norm on a vector space $V$ over $\mathbb{R}$ (or $\mathbb{C}$) is a function $\|\cdot\|\colon V \to \mathbb{R}$ satisfying three axioms: (1) $\|x\| \ge 0$ with equality if and only if $x = 0$ (positive definiteness), (2) $\|\alpha x\| = |\alpha|\,\|x\|$ for any scalar $\alpha$ (absolute homogeneity), and (3) $\|x + y\| \le \|x\| + \|y\|$ (triangle inequality).
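The three axioms can be spot-checked numerically. A minimal sketch, using illustrative vectors and a plain-Python Euclidean norm (the function name `norm` and the sample values are assumptions for the demonstration):

```python
import math

def norm(v):
    """Euclidean norm of a vector given as a list of floats."""
    return math.sqrt(sum(x * x for x in v))

x = [3.0, -4.0]
y = [1.0, 2.0]
a = -2.5

# Axiom 1: non-negativity, with zero only for the zero vector
assert norm(x) >= 0 and norm([0.0, 0.0]) == 0.0

# Axiom 2: absolute homogeneity, ||a*x|| = |a| * ||x||
scaled = [a * xi for xi in x]
assert math.isclose(norm(scaled), abs(a) * norm(x))

# Axiom 3: triangle inequality, ||x + y|| <= ||x|| + ||y||
s = [xi + yi for xi, yi in zip(x, y)]
assert norm(s) <= norm(x) + norm(y)
```

A numeric check on a few vectors is not a proof, of course, but it is a quick way to confirm an implementation respects the definition.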
Key Formula

$$\|x\| = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}$$

Where:
- $x$ = A vector in $\mathbb{R}^n$
- $x_1, x_2, \dots, x_n$ = The components of the vector
- $\|x\|$ = The Euclidean (L²) norm of the vector
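The formula translates directly into code. A sketch using only the standard library (`euclidean_norm` is an illustrative name); `math.hypot` computes the same quantity with better protection against intermediate overflow:

```python
import math

def euclidean_norm(v):
    # Direct translation of ||x|| = sqrt(x_1^2 + ... + x_n^2)
    return math.sqrt(sum(x * x for x in v))

print(euclidean_norm([1.0, 2.0, 2.0]))  # → 3.0
print(math.hypot(1.0, 2.0, 2.0))        # → 3.0
```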
How It Works
To compute the Euclidean norm (also called the $L^2$ norm or 2-norm), square each component of the vector, sum the squares, and take the square root. Other norms exist: the $L^1$ norm sums the absolute values of the components, and the $L^\infty$ norm takes the largest absolute component value. All norms satisfy the same three axioms, but they measure "size" differently. Which norm you use depends on the application — the Euclidean norm captures geometric distance, while the $L^1$ norm is common in optimization and data science.
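The three norms mentioned above can be compared side by side. A minimal sketch (the function names are illustrative):

```python
def l1_norm(v):
    # L1 norm: sum of absolute component values
    return sum(abs(x) for x in v)

def l2_norm(v):
    # L2 (Euclidean) norm: square, sum, square root
    return sum(x * x for x in v) ** 0.5

def linf_norm(v):
    # L-infinity norm: largest absolute component value
    return max(abs(x) for x in v)

v = [3.0, -4.0]
print(l1_norm(v), l2_norm(v), linf_norm(v))  # → 7.0 5.0 4.0
```

The same vector gets three different "sizes", which is exactly the point: all three satisfy the norm axioms, yet they weight the components differently.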
Worked Example
Problem: Find the Euclidean norm of the vector .
Square each component: Compute $x_i^2$ for each entry.
Sum the squares: Add the results together to get $\sum_i x_i^2$.
Take the square root: Apply the square root to get $\|x\| = \sqrt{\sum_i x_i^2}$.
Answer:
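The steps above can be traced in code. Since the example's original vector is not shown here, this sketch uses an illustrative vector $(3, 4)$ instead:

```python
import math

v = [3.0, 4.0]  # illustrative vector, not necessarily the example's

# Step 1: square each component
squares = [x * x for x in v]   # [9.0, 16.0]

# Step 2: sum the squares
total = sum(squares)           # 25.0

# Step 3: take the square root
norm_v = math.sqrt(total)
print(norm_v)  # → 5.0
```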
Why It Matters
Norms appear throughout applied mathematics and engineering — computing distances between data points in machine learning, measuring error in numerical methods, and defining convergence in functional analysis. Normalizing a vector (dividing by its norm to get a unit vector) is a routine operation in physics, computer graphics, and signal processing.
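Normalization, mentioned above, is a one-liner once the norm is available. A minimal sketch (function name `normalize` is illustrative); note the zero vector has no direction and cannot be normalized:

```python
import math

def normalize(v):
    """Divide a nonzero vector by its Euclidean norm to get a unit vector."""
    n = math.sqrt(sum(x * x for x in v))
    if n == 0:
        raise ValueError("cannot normalize the zero vector")
    return [x / n for x in v]

u = normalize([3.0, 4.0])
print(u)                                        # → [0.6, 0.8]
print(math.isclose(sum(x * x for x in u), 1.0))  # → True
```

The result points in the same direction as the input but has norm 1, which is what makes unit vectors useful for representing pure direction.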
Common Mistakes
Mistake: Forgetting to square the components before summing, and instead summing the absolute values.
Correction: That computes the $L^1$ norm, not the Euclidean norm. For the $L^2$ norm, you must square each component, sum, then take the square root.
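The mistake and the correction can be seen side by side on one vector (a hedged sketch; function names are illustrative):

```python
def l1(v):
    # The mistaken computation: summing absolute values gives the L1 norm
    return sum(abs(x) for x in v)

def l2(v):
    # The correct Euclidean (L2) computation: square, sum, square root
    return sum(x * x for x in v) ** 0.5

v = [1.0, -2.0, 2.0]
print(l1(v))  # → 5.0  (L1 norm — not what was asked for)
print(l2(v))  # → 3.0  (Euclidean norm)
```

The two results only agree when at most one component is nonzero, so the error is easy to miss on simple test vectors like $(5, 0, 0)$.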
