Eigenvalue
An eigenvalue is a special scalar λ associated with a square matrix A: multiplying A by some nonzero vector v gives the same result as multiplying v by λ. In other words, the matrix stretches, shrinks, or flips the vector without rotating it off its own line.
Given a square matrix A of size n × n, a scalar λ is called an eigenvalue of A if there exists a nonzero vector v such that Av = λv. The vector v is called an eigenvector corresponding to λ. Eigenvalues are found by solving the characteristic equation det(A − λI) = 0, where I is the identity matrix. A matrix of size n × n has at most n eigenvalues, though some may be repeated or complex.
Key Formula

det(A − λI) = 0

Where:
- A = a square matrix
- λ = the eigenvalue (the unknown to solve for)
- I = the identity matrix of the same size as A
- det = the determinant of the resulting matrix
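For a 2 × 2 matrix the characteristic equation expands to λ² − (trace)λ + (det) = 0, a quadratic that can be solved directly. The following is a minimal pure-Python sketch of that idea; the function name `eigenvalues_2x2` and the sample matrix are illustrative choices, not part of the formula itself.

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via det(A - lambda*I) = 0.

    For a 2x2 matrix the characteristic equation expands to
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0,
    i.e. lambda^2 - trace*lambda + det = 0.
    """
    trace = a + d
    det = a * d - b * c
    disc = trace * trace - 4 * det  # discriminant; negative => complex pair
    if disc >= 0:
        root = math.sqrt(disc)
    else:
        root = complex(0, math.sqrt(-disc))  # complex eigenvalues appear in pairs
    return (trace + root) / 2, (trace - root) / 2

print(eigenvalues_2x2(4, 1, 2, 3))   # sample matrix [[4, 1], [2, 3]] -> (5.0, 2.0)
print(eigenvalues_2x2(0, -1, 1, 0))  # a rotation-like matrix -> (1j, -1j)
```

Note how a negative discriminant produces a complex-conjugate pair, matching the remark above that some eigenvalues may be complex.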
Worked Example
Problem: Find the eigenvalues of the matrix A = [[4, 1], [2, 3]].
Step 1: Set up the matrix A − λI by subtracting λ from each diagonal entry: A − λI = [[4 − λ, 1], [2, 3 − λ]].
Step 2: Compute the determinant of A − λI: det(A − λI) = (4 − λ)(3 − λ) − (1)(2) = λ² − 7λ + 10.
Step 3: Set the determinant equal to zero and solve the characteristic equation: λ² − 7λ + 10 = 0.
Step 4: Factor the quadratic to find the eigenvalues: (λ − 2)(λ − 5) = 0.
Answer: The eigenvalues of the matrix are λ = 2 and λ = 5.
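A result like this can be double-checked against the definition by verifying Av = λv directly. A small sketch, assuming the sample matrix [[4, 1], [2, 3]] with candidate eigenpairs (5, [1, 1]) and (2, [1, −2]):

```python
def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[4, 1], [2, 3]]                  # sample matrix (assumed for illustration)
pairs = [(5, [1, 1]), (2, [1, -2])]   # (eigenvalue, eigenvector) candidates

for lam, v in pairs:
    Av = matvec(A, v)
    lam_v = [lam * x for x in v]
    assert Av == lam_v, (Av, lam_v)   # A v must equal lambda v
print("all eigenpairs check out")
```

Checking a candidate pair this way is a useful habit: an arithmetic slip in the determinant or factoring step is caught immediately.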
Why It Matters
Eigenvalues appear throughout science and engineering. In physics, they describe the natural frequencies at which a structure vibrates. In data science, principal component analysis (PCA) uses eigenvalues to determine which dimensions of a dataset carry the most information. Stability analysis in differential equations also depends on the eigenvalues: a linear system is asymptotically stable exactly when every eigenvalue has a negative real part.
Common Mistakes
Mistake: Forgetting to subtract λ from every diagonal entry of A.
Correction: The matrix A − λI differs from A only on the diagonal. Each diagonal entry a becomes a − λ, while all off-diagonal entries stay the same.
Mistake: Allowing the zero vector as an eigenvector.
Correction: The equation Av = λv is trivially satisfied when v = 0, so eigenvectors must be nonzero by definition. The interesting question is whether a nonzero solution exists.
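A quick sketch of why the zero vector is excluded: it satisfies Av = λv for every λ at once, so it singles out no eigenvalue. The matrix entries below are arbitrary sample values chosen for illustration.

```python
def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[4, 1], [2, 3]]   # any matrix works here (sample values)
zero = [0, 0]

# The zero vector satisfies A v = lambda v for EVERY lambda,
# which is why it carries no information and is excluded by definition.
for lam in [0, 1, 2.5, -7]:
    assert matvec(A, zero) == [lam * x for x in zero]
print("A @ 0 == lambda * 0 for every lambda tested")
```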
