Mathwords

Block Matrix — Definition, Formula & Examples

A block matrix is a matrix that has been partitioned into smaller rectangular submatrices (called blocks) by dividing its rows and columns into groups. You can then treat each block as a single entry when performing operations like addition and multiplication.

Given an $m \times n$ matrix $A$, a block partition of $A$ is a decomposition obtained by selecting row indices $1 = r_0 < r_1 < \cdots < r_p = m+1$ and column indices $1 = c_0 < c_1 < \cdots < c_q = n+1$ so that $A$ is written as a $p \times q$ array of submatrices $A_{ij}$, where each $A_{ij}$ is the $(r_i - r_{i-1}) \times (c_j - c_{j-1})$ submatrix occupying rows $r_{i-1}$ through $r_i - 1$ and columns $c_{j-1}$ through $c_j - 1$.

Key Formula

$C_{ij} = \sum_{k=1}^{s} A_{ik} B_{kj}$
Where:
  • $A_{ik}$ = the $(i,k)$ block of the first matrix
  • $B_{kj}$ = the $(k,j)$ block of the second matrix
  • $C_{ij}$ = the $(i,j)$ block of the product matrix
  • $s$ = number of block columns of $A$ (equivalently, block rows of $B$)

How It Works

To form a block matrix, draw horizontal and vertical lines through the matrix to carve it into rectangular pieces. Each piece is a block. When multiplying two block matrices, you can use the standard row-times-column rule but with blocks in place of scalar entries, provided the partitions are conformable, meaning the column partition of the first matrix matches the row partition of the second. The product block $C_{ij}$ is then $\sum_k A_{ik} B_{kj}$, where each term is a matrix product rather than a scalar product.
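The rule above can be sketched in NumPy. The 4×4 matrices and the 2×2 block layout here are illustrative choices, not data from this page; the point is that multiplying block-by-block reproduces the ordinary product.

```python
import numpy as np

# Illustrative 4x4 matrices, each partitioned into four 2x2 blocks.
rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(4, 4))
B = rng.integers(-3, 4, size=(4, 4))

def blocks(M):
    """Split a 4x4 matrix into a 2x2 array of 2x2 blocks."""
    return [[M[i:i + 2, j:j + 2] for j in (0, 2)] for i in (0, 2)]

Ab, Bb = blocks(A), blocks(B)

# Block row-times-column rule: C_ij = sum_k A_ik @ B_kj,
# where each term is a matrix product, not a scalar product.
Cb = [[sum(Ab[i][k] @ Bb[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

C_block = np.block(Cb)                  # reassemble blocks into one matrix
assert np.array_equal(C_block, A @ B)   # agrees with ordinary multiplication
```

`np.block` stitches the 2×2 array of blocks back into a single 4×4 matrix, which makes the comparison with the ordinary product `A @ B` direct.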

Worked Example

Problem: Let A be a 4×4 matrix partitioned into four 2×2 blocks. Compute the product AB in block form, where A and B are partitioned conformably.
Partition: Write A and B each as 2×2 arrays of 2×2 blocks.
$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \quad A_{11}=\begin{bmatrix}1&0\\0&1\end{bmatrix},\; A_{12}=\begin{bmatrix}2&0\\0&2\end{bmatrix},\; A_{21}=\begin{bmatrix}0&0\\0&0\end{bmatrix},\; A_{22}=\begin{bmatrix}3&0\\0&3\end{bmatrix}$
Define B: Let B have blocks that are also 2×2.
$B_{11}=\begin{bmatrix}1&1\\1&1\end{bmatrix},\; B_{12}=\begin{bmatrix}0&0\\0&0\end{bmatrix},\; B_{21}=\begin{bmatrix}1&0\\0&1\end{bmatrix},\; B_{22}=\begin{bmatrix}2&2\\2&2\end{bmatrix}$
Compute C₁₁: Apply the block multiplication formula for the (1,1) block of the product.
$C_{11} = A_{11}B_{11} + A_{12}B_{21} = \begin{bmatrix}1&1\\1&1\end{bmatrix} + \begin{bmatrix}2&0\\0&2\end{bmatrix} = \begin{bmatrix}3&1\\1&3\end{bmatrix}$
Answer: The (1,1) block of the product is $\begin{bmatrix}3&1\\1&3\end{bmatrix}$. The remaining blocks are computed the same way using $C_{ij}=A_{i1}B_{1j}+A_{i2}B_{2j}$.
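The worked example can be checked numerically; a minimal NumPy sketch of the $C_{11}$ computation:

```python
import numpy as np

# The blocks from the worked example.
I2 = np.eye(2)
A11, A12 = I2, 2 * I2
B11, B21 = np.ones((2, 2)), I2

# (1,1) block of the product: C11 = A11 B11 + A12 B21
C11 = A11 @ B11 + A12 @ B21
assert np.array_equal(C11, np.array([[3.0, 1.0], [1.0, 3.0]]))
```

Since $A_{11}$ is the identity and $A_{12}=2I$, the two terms are just $B_{11}$ and $2B_{21}$, which is why the answer combines a matrix of ones with twice the identity.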

Why It Matters

Block matrices simplify proofs and computations throughout linear algebra, especially for large structured matrices. In numerical computing and parallel processing, algorithms partition matrices into blocks to exploit cache efficiency and distribute work across processors. Topics like block diagonal matrices and the Schur complement arise directly from block partitioning.
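One structure mentioned above, the block diagonal matrix, illustrates why blocks simplify computation: determinants (and inverses) factor over the diagonal blocks. A small NumPy sketch with made-up blocks:

```python
import numpy as np

# A block diagonal matrix: blocks on the diagonal, zeros elsewhere.
D1 = np.array([[1.0, 2.0], [3.0, 4.0]])
D2 = np.array([[5.0]])
Z = np.zeros((2, 1))
M = np.block([[D1, Z], [Z.T, D2]])

# The determinant factors over the diagonal blocks: det(M) = det(D1) det(D2)
assert np.isclose(np.linalg.det(M), np.linalg.det(D1) * np.linalg.det(D2))
```

Here $\det(D_1) = -2$ and $\det(D_2) = 5$, so $\det(M) = -10$; working blockwise replaces one 3×3 determinant with a 2×2 and a 1×1.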

Common Mistakes

Mistake: Multiplying blocks from non-conformable partitions, where inner block dimensions do not match.
Correction: For block multiplication to work, the column partition of the left matrix must equal the row partition of the right matrix, so that each product A_{ik}B_{kj} involves compatible matrix dimensions.
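One way to avoid this mistake is to compare the partitions before multiplying. This hypothetical helper (not from the page) checks the conformability condition directly:

```python
def conformable(col_parts_A, row_parts_B):
    """Blockwise multiplication A*B is defined only when the column
    partition of A (block widths) equals the row partition of B
    (block heights), so every product A_ik B_kj has matching inner dims."""
    return col_parts_A == row_parts_B

print(conformable([2, 2], [2, 2]))  # True: partitions match
print(conformable([3, 1], [2, 2]))  # False: A_i1 B_1j would be (.x3)(2x.)
```

The same lists describe the worked example above: both A and B are split into column/row groups of sizes [2, 2], which is what makes the blockwise formula valid there.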