Orthogonal Decomposition — Definition, Formula & Examples
Orthogonal decomposition is the process of writing a vector as the sum of two parts: one that lies in a given subspace and one that is perpendicular to that subspace.
Given a subspace W of ℝⁿ and a vector y ∈ ℝⁿ, the orthogonal decomposition of y is y = ŷ + z, where ŷ = proj_W y ∈ W and z = y − ŷ ∈ W⊥. This decomposition is unique.
Key Formula
y = ŷ + z,  ŷ = Σᵢ₌₁ᵖ ((y · uᵢ)/(uᵢ · uᵢ)) uᵢ,  z = y − ŷ
Where:
y = the vector being decomposed
ŷ = the orthogonal projection of y onto the subspace W
z = the component of y perpendicular to W
uᵢ = the i-th vector of an orthogonal basis for W
How It Works
To decompose a vector y with respect to a subspace W, first find the orthogonal projection ŷ onto W. If {u₁, …, u_p} is an orthogonal basis for W, then ŷ = Σᵢ₌₁ᵖ ((y · uᵢ)/(uᵢ · uᵢ)) uᵢ. The perpendicular component is simply z = y − ŷ. You can verify correctness by checking that z · uᵢ = 0 for every basis vector of W.
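The steps above can be sketched in Python with NumPy. The function name orthogonal_decomposition is an illustrative choice, not a standard library routine, and it assumes the supplied basis is already orthogonal:

```python
import numpy as np

def orthogonal_decomposition(y, basis):
    """Split y into y_hat (in W) plus z (in W-perp), given an
    orthogonal basis for W as a list of vectors."""
    y = np.asarray(y, dtype=float)
    y_hat = np.zeros_like(y)
    for u in basis:
        u = np.asarray(u, dtype=float)
        # Project y onto each orthogonal basis vector and accumulate.
        y_hat += (y @ u) / (u @ u) * u
    z = y - y_hat
    return y_hat, z
```

A quick sanity check on any output is that z is orthogonal to every basis vector, mirroring the verification step described above.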
Worked Example
Problem: Decompose y = (6, 4, 1) with respect to W = Span{u₁}, where u₁ = (2, 1, 2).
Step 1: Compute the projection of y onto W using the projection formula: y · u₁ = 6·2 + 4·1 + 1·2 = 18 and u₁ · u₁ = 4 + 1 + 4 = 9, so ŷ = (18/9)u₁ = 2u₁ = (4, 2, 4).
Step 2: Subtract to get the perpendicular component: z = y − ŷ = (2, 2, −3). Check: z · u₁ = 4 + 2 − 6 = 0.
Answer: y = (4, 2, 4) + (2, 2, −3), where the first vector is in W and the second is in W⊥.
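The worked example can be verified numerically; this is a minimal sketch using NumPy:

```python
import numpy as np

y = np.array([6.0, 4.0, 1.0])
u1 = np.array([2.0, 1.0, 2.0])

# Projection of y onto Span{u1}: (y.u1 / u1.u1) * u1 = (18/9) * u1
y_hat = (y @ u1) / (u1 @ u1) * u1
z = y - y_hat

print(y_hat)      # [4. 2. 4.]
print(z)          # [ 2.  2. -3.]
print(z @ u1)     # 0.0 -- z is perpendicular to W
```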
Why It Matters
Orthogonal decomposition underlies least-squares regression, which finds the best-fit line or curve when a system of equations has no exact solution. It is also central to the Gram-Schmidt process and QR factorization, tools used heavily in numerical computing, data science, and signal processing.
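The least-squares connection can be seen directly: the least-squares solution of an overdetermined system Ax = b makes Ax the orthogonal projection of b onto the column space of A, so the residual b − Ax lies in the orthogonal complement. A small sketch (the data points are made up for illustration):

```python
import numpy as np

# Overdetermined system: fit a line c0 + c1*t through three points.
t = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 4.0])
A = np.column_stack([np.ones_like(t), t])   # columns span the subspace W = Col(A)

# The least-squares solution minimizes ||Ax - b||; A @ x is then the
# orthogonal projection of b onto Col(A).
x, *_ = np.linalg.lstsq(A, b, rcond=None)
b_hat = A @ x            # projection of b onto W
residual = b - b_hat     # lies in W-perp

print(A.T @ residual)    # approximately [0, 0]: residual is perpendicular to Col(A)
```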
Common Mistakes
Mistake: Using a basis for W that is not orthogonal in the projection formula.
Correction: The summation formula requires an orthogonal basis. If your basis is not orthogonal, first apply the Gram-Schmidt process to orthogonalize it, or use the matrix formula ŷ = A(AᵀA)⁻¹Aᵀy instead, where the columns of A are the (not necessarily orthogonal) basis vectors of W.
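The matrix formula can be sketched as follows; the basis vectors here are an arbitrary non-orthogonal example, and np.linalg.solve is used rather than forming the explicit inverse, which is the numerically preferred way to apply (AᵀA)⁻¹:

```python
import numpy as np

# Columns of A are a NON-orthogonal basis for a plane W in R^3
# (their dot product is 1, not 0).
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
y = np.array([6.0, 4.0, 1.0])

# y_hat = A (A^T A)^{-1} A^T y, computed via a linear solve.
y_hat = A @ np.linalg.solve(A.T @ A, A.T @ y)
z = y - y_hat

print(A.T @ z)   # approximately [0, 0]: z is perpendicular to W
```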