
Big Omega Notation — Definition, Formula & Examples

Big Omega notation, written Ω(g(n)), describes a lower bound on how fast a function grows. It tells you the minimum growth rate of an algorithm or function for sufficiently large inputs.

A function f(n) is in Ω(g(n)) if there exist positive constants c and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀. Equivalently, g(n) grows no faster than f(n) asymptotically.

Key Formula

f(n) ∈ Ω(g(n))  ⟺  ∃ c > 0, n₀ > 0 such that f(n) ≥ c·g(n) for all n ≥ n₀
Where:
  • f(n) = the function being analyzed (e.g., an algorithm's running time)
  • g(n) = the lower-bound reference function
  • c = a positive constant multiplier
  • n₀ = the threshold beyond which the inequality holds

How It Works

To show that f(n) ∈ Ω(g(n)), you must find specific constants c > 0 and n₀ > 0 such that f(n) ≥ c·g(n) holds for every n ≥ n₀. Think of Big Omega as the mirror image of Big O: Big O gives an upper bound (the function grows no faster than...), while Big Omega gives a lower bound (the function grows at least as fast as...). When a function is both O(g(n)) and Ω(g(n)), it is Θ(g(n)), meaning it grows at exactly the same rate as g(n) up to constant factors.
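The definition can be made concrete with a small numerical spot check. The helper below (the name `satisfies_omega_witness` is my own, not standard terminology) tests a candidate witness pair (c, n₀) against f and g over a finite sample of n values — evidence for the inequality, not a proof, since the definition quantifies over every n ≥ n₀:

```python
def satisfies_omega_witness(f, g, c, n0, n_max=10_000):
    """Spot-check f(n) >= c * g(n) for all sampled n in [n0, n_max].

    A True result is only numerical evidence: the actual definition
    requires the inequality for *every* n >= n0, so a proof still
    needs an algebraic argument.
    """
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))


# f(n) = 3n^2 + 5n versus g(n) = n^2, with witnesses c = 3, n0 = 1
f = lambda n: 3 * n**2 + 5 * n
g = lambda n: n**2
print(satisfies_omega_witness(f, g, c=3, n0=1))  # True
print(satisfies_omega_witness(f, g, c=4, n0=1))  # False: fails once 5n < n^2
```

Note that c = 4 fails: 3n² + 5n ≥ 4n² requires 5n ≥ n², which breaks down for n > 5 — choosing witnesses is part of the proof.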

Worked Example

Problem: Prove that f(n) = 3n² + 5n is in Ω(n²).
Step 1: We need to find constants c > 0 and n₀ > 0 such that 3n² + 5n ≥ c·n² for all n ≥ n₀.
3n² + 5n ≥ c·n²
Step 2: Since 5n ≥ 0 for all n ≥ 1, we know 3n² + 5n ≥ 3n² for all n ≥ 1. So choose c = 3 and n₀ = 1.
3n² + 5n ≥ 3n²  for all n ≥ 1
Step 3: The inequality f(n) ≥ 3·n² holds for all n ≥ 1, satisfying the definition.
Answer: f(n) = 3n² + 5n ∈ Ω(n²), witnessed by c = 3 and n₀ = 1.
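The chosen witnesses can be sanity-checked numerically — a quick sketch that samples a few values of n, complementing (not replacing) the algebraic argument above:

```python
# Verify 3n^2 + 5n >= c * n^2 with c = 3, n0 = 1 at sample points.
c, n0 = 3, 1
for n in [1, 10, 100, 1000]:
    f_n = 3 * n**2 + 5 * n
    bound = c * n**2
    assert f_n >= bound
    print(f"n={n:>5}: f(n)={f_n:>10}  c*g(n)={bound:>10}  ok")
```

The gap between f(n) and the bound is exactly 5n, which stays nonnegative for all n ≥ 1 — the same observation Step 2 uses.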

Why It Matters

Big Omega notation is essential in algorithm analysis for proving that a problem requires at least a certain amount of resources. For instance, comparison-based sorting has a proven lower bound of Ω(n log n), meaning no comparison sort can do better in the worst case. Understanding lower bounds helps you recognize when an algorithm is already optimal.

Common Mistakes

Mistake: Confusing Big Omega (lower bound) with Big O (upper bound), or thinking Ω(n²) means the function grows at exactly the rate n².
Correction: Ω(n²) only guarantees the function grows at least as fast as n². A function that is Ω(n²) is also Ω(n), since anything growing at least as fast as n² certainly grows at least as fast as n; for example, f(n) = n³ is Ω(n²) but grows strictly faster than n². For a tight bound, you need Θ.
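To see why membership in Ω(n²) automatically implies membership in Ω(n), compare f(n) = n³ against both reference functions. This is a spot check over a finite sample, not a proof:

```python
# f(n) = n^3 dominates both n^2 and n (with witnesses c = 1, n0 = 1),
# so it is in Omega(n^2) and in Omega(n). It is NOT Theta(n^2),
# because it eventually outgrows any constant multiple of n^2.
f = lambda n: n**3

for g, name in [(lambda n: n**2, "n^2"), (lambda n: n, "n")]:
    ok = all(f(n) >= 1 * g(n) for n in range(1, 1001))
    print(f"f(n) >= g(n) for g = {name}: {ok}")
```

A weaker reference function only loosens the lower bound; the tight statement about n³ is Θ(n³).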