Factoring — Definition, Formula & Examples
Factoring is the process of rewriting a mathematical expression as a product of two or more simpler expressions. For example, you can factor x^2 - 4 into (x - 2)(x + 2).
To factor a polynomial over the integers is to express it as a product c · p_1(x) · p_2(x) · … · p_k(x), where c is an integer and each p_i(x) is a polynomial of degree at least 1 with integer coefficients that cannot be factored further (i.e., each p_i(x) is irreducible over the integers).
Key Formula
a^2 - b^2 = (a - b)(a + b)
Where:
- a = Any expression (variable, number, or combination)
- b = Any expression (variable, number, or combination)
How It Works
Factoring reverses the multiplication of polynomials. You start with a single expression and break it into pieces whose product equals the original. The most common techniques are: pulling out a greatest common factor (GCF), factoring trinomials of the form x^2 + bx + c, using the difference of squares identity, and grouping terms. Once a polynomial is fully factored, you can set each factor equal to zero to solve equations — this is the zero-product property. Choosing the right method depends on the number of terms, the degree, and any recognizable patterns in the expression.
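The trinomial technique described above — finding two numbers that multiply to the constant term and add to the middle coefficient — can be sketched as a brute-force search. The function name and search bound here are illustrative, not part of any standard library.

```python
def factor_simple_trinomial(b, c):
    """For x^2 + bx + c, return a pair (p, q) with p * q == c and
    p + q == b (so it factors as (x + p)(x + q)), or None if no
    integer pair exists."""
    for p in range(-abs(c), abs(c) + 1):
        if p != 0 and c % p == 0:
            q = c // p
            if p + q == b:
                return (p, q)
    return None

print(factor_simple_trinomial(7, 12))  # -> (3, 4), i.e. (x + 3)(x + 4)
```

A return value of None signals that the trinomial has no factorization into integer linear factors, which is exactly when other techniques (or the quadratic formula) are needed.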
Worked Example
Problem: Factor the trinomial x^2 + 7x + 12 completely.
Step 1: Identify two numbers that multiply to 12 (the constant term) and add to 7 (the coefficient of x).
Step 2: Test factor pairs of 12: (1,12), (2,6), (3,4). The pair 3 and 4 satisfies both conditions because 3 × 4 = 12 and 3 + 4 = 7.
Step 3: Write the factored form using these values.
Answer: x^2 + 7x + 12 = (x + 3)(x + 4)
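A factorization can always be sanity-checked numerically: the factored form must equal the original polynomial at every input, so spot-checking several values is a quick (if not exhaustive) verification of x^2 + 7x + 12 = (x + 3)(x + 4).

```python
# Spot-check the factorization at a range of integer points.
for x in range(-5, 6):
    assert x**2 + 7*x + 12 == (x + 3) * (x + 4)
print("factored form matches the original at all sampled points")
```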
Another Example
Problem: Factor 6x^2 - 24 completely.
Step 1: Factor out the greatest common factor. Both terms share a factor of 6, giving 6(x^2 - 4).
Step 2: Recognize that x^2 - 4 is a difference of squares: x^2 - 4 = (x - 2)(x + 2).
Step 3: Combine everything for the fully factored result.
Answer: 6x^2 - 24 = 6(x - 2)(x + 2)
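The GCF step can be mechanized with the standard library's gcd, and the final result spot-checked numerically. The example expression 6x^2 - 24 is the one assumed in the worked steps above.

```python
import math

# GCF of the coefficients 6 and 24 (the two terms of 6x^2 - 24).
gcf = math.gcd(6, 24)
print(gcf)  # -> 6

# Numeric spot-check of 6x^2 - 24 == 6(x - 2)(x + 2).
for x in range(-5, 6):
    assert 6*x**2 - 24 == 6 * (x - 2) * (x + 2)
```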
Why It Matters
Factoring is essential in Algebra 1, Algebra 2, and Precalculus for solving quadratic equations, simplifying rational expressions, and finding polynomial roots. Engineers and physicists use it to solve design equations, while data scientists factor expressions when simplifying statistical models. Mastering factoring also builds the foundation for partial fraction decomposition in calculus.
Common Mistakes
Mistake: Forgetting to factor out the GCF first, which makes the remaining expression harder — or impossible — to factor by other methods.
Correction: Always check for a greatest common factor before trying any other technique. For instance, 2x^2 + 14x + 24 should first become 2(x^2 + 7x + 12), which then factors as 2(x + 3)(x + 4).
Mistake: Incorrectly applying the difference of squares pattern to a sum of squares, writing a^2 + b^2 = (a - b)(a + b).
Correction: The identity a^2 - b^2 = (a - b)(a + b) only works for a difference (minus sign). The sum a^2 + b^2 does not factor over the real numbers.
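One way to see why a sum of squares resists factoring: a quadratic factors into real linear pieces only when it has real roots, i.e. when its discriminant b^2 - 4ac is nonnegative. A short check on x^2 + 9 (an illustrative sum of squares) shows its discriminant is negative.

```python
# x^2 + 9 viewed as ax^2 + bx + c with a = 1, b = 0, c = 9.
a, b, c = 1, 0, 9
disc = b**2 - 4 * a * c
print(disc)  # -> -36

# Negative discriminant: no real roots, hence no real linear factors.
assert disc < 0
```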
