Mathwords

Orthogonal Decomposition — Definition, Formula & Examples

Orthogonal decomposition is the process of writing a vector as the sum of two parts: one that lies in a given subspace and one that is perpendicular to that subspace.

Given a subspace $W$ of $\mathbb{R}^n$ and a vector $\mathbf{y} \in \mathbb{R}^n$, the orthogonal decomposition of $\mathbf{y}$ is $\mathbf{y} = \hat{\mathbf{y}} + \mathbf{z}$, where $\hat{\mathbf{y}} = \mathrm{proj}_W \mathbf{y} \in W$ and $\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}} \in W^\perp$. This decomposition is unique.

Key Formula

$$\mathbf{y} = \hat{\mathbf{y}} + \mathbf{z}, \quad \hat{\mathbf{y}} = \sum_{i=1}^{p} \frac{\mathbf{y} \cdot \mathbf{u}_i}{\mathbf{u}_i \cdot \mathbf{u}_i}\,\mathbf{u}_i, \quad \mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$$
Where:
  • $\mathbf{y}$ = the vector being decomposed
  • $\hat{\mathbf{y}}$ = the orthogonal projection of $\mathbf{y}$ onto the subspace $W$
  • $\mathbf{z}$ = the component of $\mathbf{y}$ perpendicular to $W$
  • $\mathbf{u}_i$ = vectors of an orthogonal basis for $W$

How It Works

To decompose a vector $\mathbf{y}$ with respect to a subspace $W$, first find the orthogonal projection $\hat{\mathbf{y}}$ of $\mathbf{y}$ onto $W$. If $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ is an orthogonal basis for $W$, then $\hat{\mathbf{y}} = \sum_{i=1}^{p} \frac{\mathbf{y} \cdot \mathbf{u}_i}{\mathbf{u}_i \cdot \mathbf{u}_i}\,\mathbf{u}_i$. The perpendicular component is then $\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$. You can verify correctness by checking that $\mathbf{z} \cdot \mathbf{u}_i = 0$ for every basis vector of $W$.
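The procedure above can be sketched in plain Python. The helper names (`dot`, `decompose`, etc.) are illustrative, not from any particular library; the sketch assumes `basis` is an orthogonal basis for $W$.

```python
# Orthogonal decomposition y = y_hat + z against an ORTHOGONAL basis for W.
# Plain-list sketch; helper names are hypothetical.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def decompose(y, basis):
    """Return (y_hat, z): y_hat in W = span(basis), z in the orthogonal complement."""
    y_hat = [0.0] * len(y)
    for u in basis:
        coeff = dot(y, u) / dot(u, u)          # (y . u_i) / (u_i . u_i)
        y_hat = [a + coeff * b for a, b in zip(y_hat, u)]
    z = [a - b for a, b in zip(y, y_hat)]      # perpendicular component
    return y_hat, z
```

Running `decompose([6, 4, 1], [[2, 1, 2]])` reproduces the worked example below, and `dot(z, u)` comes out zero for each basis vector, which is the orthogonality check described above.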

Worked Example

Problem: Decompose $\mathbf{y} = \begin{bmatrix} 6 \\ 4 \\ 1 \end{bmatrix}$ with respect to $W = \mathrm{Span}\{\mathbf{u}_1\}$, where $\mathbf{u}_1 = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$.
Step 1: Compute the projection of $\mathbf{y}$ onto $W$ using the projection formula.
$$\hat{\mathbf{y}} = \frac{\mathbf{y} \cdot \mathbf{u}_1}{\mathbf{u}_1 \cdot \mathbf{u}_1}\,\mathbf{u}_1 = \frac{12 + 4 + 2}{4 + 1 + 4}\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = \frac{18}{9}\begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 4 \\ 2 \\ 4 \end{bmatrix}$$
Step 2: Find the perpendicular component by subtracting.
$$\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}} = \begin{bmatrix} 6 \\ 4 \\ 1 \end{bmatrix} - \begin{bmatrix} 4 \\ 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 2 \\ 2 \\ -3 \end{bmatrix}$$
Step 3: Verify orthogonality.
$$\mathbf{z} \cdot \mathbf{u}_1 = (2)(2) + (2)(1) + (-3)(2) = 4 + 2 - 6 = 0 \checkmark$$
Answer: $\mathbf{y} = \begin{bmatrix} 4 \\ 2 \\ 4 \end{bmatrix} + \begin{bmatrix} 2 \\ 2 \\ -3 \end{bmatrix}$, where the first vector is in $W$ and the second is in $W^\perp$.
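The three steps above can be checked with a few lines of arithmetic, a minimal sanity-check sketch using the numbers from this example:

```python
# Check the worked example: y_hat + z reassembles y, and z is perpendicular to u1.
y = [6, 4, 1]
y_hat = [4, 2, 4]
z = [2, 2, -3]
u1 = [2, 1, 2]

assert all(h + c == o for h, c, o in zip(y_hat, z, y))  # y = y_hat + z
assert sum(a * b for a, b in zip(z, u1)) == 0           # z . u1 = 0
```

Both checks pass, confirming the decomposition is consistent and that $\mathbf{z}$ really lies in $W^\perp$.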

Why It Matters

Orthogonal decomposition underlies least-squares regression, which finds the best-fit line or curve when a system of equations has no exact solution. It is also central to the Gram-Schmidt process and QR factorization, tools used heavily in numerical computing, data science, and signal processing.

Common Mistakes

Mistake: Using a basis for $W$ that is not orthogonal in the projection formula.
Correction: The summation formula requires an orthogonal basis. If your basis is not orthogonal, first apply the Gram-Schmidt process to orthogonalize it, or use the matrix formula $\hat{\mathbf{y}} = A(A^TA)^{-1}A^T\mathbf{y}$, where the columns of $A$ form any basis for $W$.
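The matrix formula can be sketched for a two-dimensional $W$ using only plain Python. The basis columns and the vector $\mathbf{y}$ here are made-up illustration values, and `inv2` is a hypothetical helper that inverts the 2x2 matrix $A^TA$:

```python
# Projection via the normal-equations formula y_hat = A (A^T A)^{-1} A^T y.
# Works even when the columns of A (a basis for W) are NOT orthogonal.

def transpose(M):
    return [list(col) for col in zip(*M)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def inv2(M):
    """Inverse of a 2x2 matrix -- enough for a 2-dimensional W."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 0], [1, 1], [0, 1]]   # columns span W; note they are not orthogonal
y = [1, 2, 3]

At = transpose(A)
x = matvec(inv2(matmul(At, A)), matvec(At, y))   # (A^T A)^{-1} A^T y
y_hat = matvec(A, x)                             # projection of y onto W
z = [yi - hi for yi, hi in zip(y, y_hat)]        # perpendicular component
```

For this example the residual $\mathbf{z}$ is orthogonal to both columns of $A$, even though those columns are not orthogonal to each other, which is exactly why this formula is the safe fallback.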