Gram Matrix — Definition, Formula & Examples
A Gram matrix is a matrix of all pairwise inner products of a set of vectors. If your vectors are the columns of a matrix A, the Gram matrix is G = AᵀA.
Given vectors v₁, …, vₙ in an inner product space, the Gram matrix is the n × n matrix whose (i, j)-entry is ⟨vᵢ, vⱼ⟩. When these vectors form the columns of a real m × n matrix A, the Gram matrix equals AᵀA.
Key Formula
G = Aᵀ A
Where:
- A = An m × n real matrix whose columns are the vectors v₁, …, vₙ
- G = The n × n Gram matrix of inner products
- Gᵢⱼ = vᵢ · vⱼ = The dot product of the i-th and j-th column vectors of A
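As a quick check of the formula, the entrywise definition and the matrix product give the same result. A minimal NumPy sketch (the matrix here is illustrative):

```python
import numpy as np

# Columns of A are the vectors v1, ..., vn (here m = 3, n = 2).
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# Entrywise definition: G[i, j] = v_i . v_j
n = A.shape[1]
G_entrywise = np.array([[A[:, i] @ A[:, j] for j in range(n)]
                        for i in range(n)])

# Matrix form: G = A^T A
G_matrix = A.T @ A

print(np.allclose(G_entrywise, G_matrix))  # True
```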
How It Works
Each entry Gᵢⱼ tells you the dot product of the i-th and j-th vectors. Diagonal entries Gᵢᵢ = ‖vᵢ‖² give the squared lengths. Off-diagonal entries measure how aligned two vectors are. The Gram matrix is always symmetric and positive semi-definite. Its determinant equals zero exactly when the vectors are linearly dependent, which makes it a direct test for linear independence.
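These properties are easy to observe numerically. A small sketch (the matrices are illustrative, chosen so one set of columns is independent and the other is not):

```python
import numpy as np

# Independent columns -> nonzero Gram determinant.
A_indep = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])
G = A_indep.T @ A_indep

print(np.allclose(G, G.T))                       # symmetric
print(np.all(np.linalg.eigvalsh(G) >= -1e-12))   # positive semi-definite
print(np.linalg.det(G))                          # nonzero -> independent

# Dependent columns (second column = 2 * first) -> determinant is 0.
A_dep = np.array([[1.0, 2.0],
                  [3.0, 6.0],
                  [0.0, 0.0]])
print(np.linalg.det(A_dep.T @ A_dep))            # ~0 -> dependent
```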
Worked Example
Problem: Find the Gram matrix for the columns of A = [[1, 2], [3, 0], [0, 1]].
Step 1: Write out A and compute its transpose, Aᵀ = [[1, 3, 0], [2, 0, 1]].
Step 2: Multiply Aᵀ A to get the 2×2 Gram matrix: AᵀA = [[10, 2], [2, 5]].
Step 3: Interpret the result. The diagonal entries 10 and 5 are the squared norms of the two column vectors. The off-diagonal entry 2 is their dot product.
Answer: The Gram matrix is AᵀA = [[10, 2], [2, 5]].
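The worked example can be reproduced in a couple of lines of NumPy:

```python
import numpy as np

# The matrix from the worked example; columns v1 = (1, 3, 0), v2 = (2, 0, 1).
A = np.array([[1, 2],
              [3, 0],
              [0, 1]])

G = A.T @ A  # the 2x2 Gram matrix of the columns
print(G)
# [[10  2]
#  [ 2  5]]
```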
Why It Matters
The Gram matrix appears in least-squares problems, where the normal equations AᵀA x = Aᵀb use it directly. It is also central to kernel methods in machine learning, where you compute inner products without explicitly working in high-dimensional feature spaces.
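To illustrate the least-squares connection, here is a minimal sketch (the data is made up) that solves the normal equations AᵀA x = Aᵀb via the Gram matrix and checks the result against NumPy's least-squares solver:

```python
import numpy as np

# Overdetermined system A x ≈ b (3 equations, 2 unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

G = A.T @ A                       # Gram matrix of the columns of A
x = np.linalg.solve(G, A.T @ b)   # normal equations: G x = Aᵀ b

# Matches NumPy's least-squares solver when A has full column rank.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))  # True
```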
Common Mistakes
Mistake: Computing AAᵀ instead of AᵀA and calling it the Gram matrix of the columns.
Correction: AAᵀ gives inner products of the row vectors of A (or equivalently the columns of Aᵀ). For the Gram matrix of the columns of A, you need AᵀA.
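The shapes alone expose the mistake. For the 3×2 matrix from the worked example:

```python
import numpy as np

# A is 3x2: two 3-dimensional column vectors.
A = np.array([[1.0, 2.0],
              [3.0, 0.0],
              [0.0, 1.0]])

print((A.T @ A).shape)  # (2, 2): Gram matrix of the columns
print((A @ A.T).shape)  # (3, 3): Gram matrix of the rows
```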
