Schwarz's Inequality — Definition, Formula & Examples
Schwarz's Inequality states that the absolute value of the inner product of two vectors (or the sum of products of corresponding terms) is always less than or equal to the product of their magnitudes. It provides a fundamental upper bound that appears throughout linear algebra, analysis, and probability.
For vectors $u$ and $v$ in an inner product space, $|\langle u, v \rangle| \le \|u\|\,\|v\|$, with equality holding if and only if $u$ and $v$ are linearly dependent. This is also known as the Cauchy–Schwarz inequality.
Key Formula
$$\left(\sum_{i=1}^{n} a_i b_i\right)^2 \le \left(\sum_{i=1}^{n} a_i^2\right)\left(\sum_{i=1}^{n} b_i^2\right)$$
Where:
- $a_i$ = Components of the first vector or sequence of real numbers
- $b_i$ = Components of the second vector or sequence of real numbers
- $n$ = Number of components
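As a minimal sketch of the formula above, both sides can be computed directly from two lists of numbers (the helper name `schwarz_sides` is illustrative, not standard):

```python
def schwarz_sides(a, b):
    """Return (left, right) for Schwarz's Inequality:
    left  = (sum of a_i * b_i)^2
    right = (sum of a_i^2) * (sum of b_i^2)
    """
    left = sum(x * y for x, y in zip(a, b)) ** 2
    right = sum(x * x for x in a) * sum(y * y for y in b)
    return left, right

left, right = schwarz_sides([1, 2, 3], [4, 5, 6])
print(left, right)   # 1024 1078
assert left <= right
```

For any real inputs the first value never exceeds the second, matching the inequality.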
How It Works
In the finite-dimensional real case, the inequality says $\left(\sum_{i=1}^{n} a_i b_i\right)^2 \le \left(\sum_{i=1}^{n} a_i^2\right)\left(\sum_{i=1}^{n} b_i^2\right)$. To verify or apply it, compute each side separately and confirm the left side does not exceed the right. Equality occurs precisely when one list of values is a constant multiple of the other, meaning $b_i = \lambda a_i$ for all $i$. The inequality also justifies why the cosine formula $\cos\theta = \frac{\langle u, v \rangle}{\|u\|\,\|v\|}$ always produces a value between $-1$ and $1$.
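A short sketch of that cosine bound (the helper name `cos_angle` is ours): because Schwarz's Inequality caps the numerator by the denominator, the ratio always lands in $[-1, 1]$.

```python
import math

def cos_angle(a, b):
    """Cosine of the angle between vectors a and b.
    Schwarz's Inequality guarantees the result lies in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms

# b = 2a is the equality (linearly dependent) case: cosine is 1 up to rounding.
print(cos_angle([1, 2], [2, 4]))
# Orthogonal vectors give cosine 0.
print(cos_angle([1, 0], [0, 1]))
```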
Worked Example
Problem: Verify Schwarz's Inequality for a = (1, 2, 3) and b = (4, 5, 6).
Compute the inner product: Find the dot product of the two vectors: $\mathbf{a} \cdot \mathbf{b} = 1 \cdot 4 + 2 \cdot 5 + 3 \cdot 6 = 32$.
Compute the left side: Square the inner product: $32^2 = 1024$.
Compute the right side: Find the product of the squared magnitudes: $(1^2 + 2^2 + 3^2)(4^2 + 5^2 + 6^2) = 14 \cdot 77 = 1078$.
Answer: Since $1024 \le 1078$, the inequality holds. Equality does not hold because $\mathbf{b}$ is not a scalar multiple of $\mathbf{a}$.
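The three steps of the worked example translate line by line into code (a sketch; variable names are ours):

```python
# Worked example: a = (1, 2, 3), b = (4, 5, 6).
a = [1, 2, 3]
b = [4, 5, 6]

# Step 1: inner product, 1*4 + 2*5 + 3*6.
dot = sum(x * y for x, y in zip(a, b))

# Step 2: left side, the squared inner product.
left = dot ** 2

# Step 3: right side, product of the squared magnitudes, 14 * 77.
right = sum(x * x for x in a) * sum(y * y for y in b)

print(dot, left, right)   # 32 1024 1078
assert left <= right      # 1024 <= 1078, so the inequality holds
```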
Why It Matters
Schwarz's Inequality is a prerequisite for proving the triangle inequality in normed spaces and for defining angles between vectors in any dimension. It appears directly in courses on linear algebra, real analysis, and quantum mechanics, and underpins results in statistics such as the correlation coefficient always lying between $-1$ and $1$.
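The statistics connection can be sketched directly: applying Schwarz's Inequality to mean-centered data bounds the Pearson correlation by $1$ in absolute value (the helper name `correlation` is ours, assuming equal-length inputs with nonzero variance):

```python
import math

def correlation(xs, ys):
    """Pearson correlation; Schwarz's Inequality bounds it to [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    dx = [x - mx for x in xs]          # centered first variable
    dy = [y - my for y in ys]          # centered second variable
    num = sum(a * b for a, b in zip(dx, dy))
    den = math.sqrt(sum(a * a for a in dx)) * math.sqrt(sum(b * b for b in dy))
    return num / den

r = correlation([1, 2, 3, 4], [2, 4, 5, 9])
assert -1.0 <= r <= 1.0   # guaranteed by Schwarz's Inequality
```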
Common Mistakes
Mistake: Forgetting to square the inner product on the left side and comparing $\sum a_i b_i$ directly to $\left(\sum a_i^2\right)\left(\sum b_i^2\right)$.
Correction: The standard form compares $\left(\sum a_i b_i\right)^2$ to $\left(\sum a_i^2\right)\left(\sum b_i^2\right)$. Equivalently, you can compare $\left|\sum a_i b_i\right|$ to $\sqrt{\sum a_i^2}\,\sqrt{\sum b_i^2}$ (without squaring the norms), but be consistent on both sides.
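The two consistent forms of the comparison can be sketched side by side (example vectors are arbitrary):

```python
import math

a = [3, 1]
b = [1, 2]

dot = sum(x * y for x, y in zip(a, b))            # 3*1 + 1*2 = 5
sq_a = sum(x * x for x in a)                      # 10
sq_b = sum(y * y for y in b)                      # 5

# Squared form: (sum a_i b_i)^2 <= (sum a_i^2)(sum b_i^2).
assert dot ** 2 <= sq_a * sq_b                    # 25 <= 50

# Unsquared form: |sum a_i b_i| <= sqrt(sum a_i^2) * sqrt(sum b_i^2).
assert abs(dot) <= math.sqrt(sq_a) * math.sqrt(sq_b)

# Mixing the forms (unsquared left vs. squared right) gives a meaningless
# comparison; with small magnitudes it even appears to "violate" the bound:
assert (0.1 * 0.1) > (0.1 ** 2) * (0.1 ** 2)      # not a counterexample
```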
