Orthogonal Vectors — Definition, Formula & Examples
Orthogonal vectors are two vectors that meet at a right angle (90°). You can test for orthogonality by checking whether their dot product equals zero.
Two vectors a and b in an inner product space are orthogonal if and only if a · b = 0. The zero vector is considered orthogonal to every vector by convention.
Key Formula
a · b = a₁b₁ + a₂b₂ + ⋯ + aₙbₙ = 0
Where:
- a, b = The two vectors being tested for orthogonality
- aᵢ, bᵢ = The corresponding components of each vector
- n = The dimension of the vector space
How It Works
To determine whether two vectors are orthogonal, compute their dot product. Multiply corresponding components and sum the results. If the sum is exactly zero, the vectors are orthogonal. This works in any dimension — two components, three, or more. No angle calculation is needed; the dot product alone gives a definitive answer.
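The procedure above translates directly into a few lines of Python; this is a minimal sketch, and the function names `dot` and `is_orthogonal` are illustrative, not from any particular library:

```python
def dot(u, v):
    """Sum the products of corresponding components."""
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal(u, v, tol=1e-9):
    """Vectors are orthogonal when their dot product is (numerically) zero."""
    return abs(dot(u, v)) < tol

print(is_orthogonal((1, 0), (0, 5)))         # True: perpendicular axes
print(is_orthogonal((1, 2, -1), (3, 1, 5)))  # True: 3 + 2 - 5 = 0
print(is_orthogonal((1, 1), (1, 2)))         # False: dot product is 3
```

The tolerance parameter matters in practice: with floating-point components, a dot product that is mathematically zero may come out as a tiny nonzero number, so comparing against a small threshold is more robust than testing `== 0`.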
Worked Example
Problem: Determine whether u = (1, 2, −1) and v = (3, 1, 5) are orthogonal.
Compute the dot product: Multiply corresponding components and add them together.
u · v = (1)(3) + (2)(1) + (−1)(5)
Evaluate the sum: Add the three products.
u · v = 3 + 2 − 5 = 0
Answer: The dot product is 0, so u and v are orthogonal.
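A computation like this is easy to confirm with a short script; the integer vectors here are illustrative:

```python
u = (1, 2, -1)
v = (3, 1, 5)

# Multiply corresponding components, then sum the products.
dot = sum(a * b for a, b in zip(u, v))

print(dot)       # 0 -> the vectors are orthogonal
print(dot == 0)  # True
```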
Why It Matters
Orthogonality is foundational in linear algebra and applied mathematics. Orthogonal bases simplify projections, least-squares approximations, and the Gram-Schmidt process. In fields like computer graphics, signal processing, and machine learning, decomposing data along orthogonal directions makes computation faster and more numerically stable.
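The Gram-Schmidt process mentioned above is a concrete payoff of the orthogonality test: it turns any linearly independent set into an orthogonal one by repeatedly subtracting projections. A minimal sketch in plain Python (the names `dot` and `gram_schmidt` are illustrative):

```python
def dot(u, v):
    """Sum the products of corresponding components."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: subtract from each vector its
    projection onto every previously accepted basis vector."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            coeff = dot(w, b) / dot(b, b)
            w = [wi - coeff * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

u1, u2 = gram_schmidt([(3, 1), (2, 2)])
print(dot(u1, u2))  # approximately 0, up to floating-point rounding
```

After the subtraction step, each new vector's dot product with every earlier basis vector is zero (up to rounding), which is exactly the orthogonality condition from the formula above.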
Common Mistakes
Mistake: Confusing orthogonal with parallel. Students sometimes check if the dot product is zero and conclude the vectors point the same direction.
Correction: A zero dot product means the vectors are perpendicular (orthogonal). Parallel vectors have a dot product equal to the product of their magnitudes (or its negative), i.e., |a · b| = ‖a‖‖b‖.
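The distinction can be made mechanical: the dot product alone separates the two cases. A sketch, with the function name `classify` and the specific test vectors chosen for illustration:

```python
import math

def dot(u, v):
    """Sum the products of corresponding components."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Euclidean length of a vector."""
    return math.sqrt(dot(u, u))

def classify(u, v, tol=1e-9):
    """Orthogonal when u.v = 0; parallel when |u.v| = |u||v|."""
    d = dot(u, v)
    if abs(d) < tol:
        return "orthogonal"
    if abs(abs(d) - norm(u) * norm(v)) < tol:
        return "parallel"
    return "neither"

print(classify((1, 2), (-2, 1)))  # orthogonal: -2 + 2 = 0
print(classify((1, 2), (2, 4)))   # parallel: |u.v| = 10 = |u||v|
print(classify((1, 2), (1, 1)))   # neither
```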
