Big-O, Ω & Θ
Algorithms · Asymptotic Analysis
Core Concepts
Plain English first, then the math
01 Big-O, Ω, and Θ — what do they actually mean?

Think of two runners: f and g. These three notations just ask one question: who wins in a long race?

🏃 f = O(g) — "f is at most as fast as g." f doesn't outrun g (up to a constant). g is an upper bound.

🏃 f = Ω(g) — "f is at least as fast as g." g is a lower bound for f.

🏃 f = Θ(g) — "f and g are neck and neck." Same speed. Both O and Ω hold.

Formally, f = O(g) means: there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. You're allowed to scale g by any constant — you just can't let it go below f forever.

Worked Example
Is 3n² + 5n = O(n²)?
1. We need constants c, n₀ such that 3n² + 5n ≤ c·n² for all n ≥ n₀
2. Factor: 3n² + 5n = n²(3 + 5/n). As n grows, 5/n shrinks toward 0.
3. For n ≥ 1: 5/n ≤ 5, so 3n² + 5n = n²(3 + 5/n) ≤ 8n²
✓ c = 8, n₀ = 1 works. So 3n² + 5n = O(n²)
4. 3n² + 5n is also clearly ≥ 3n², so it's Ω(n²) too → Θ(n²)
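The constants from the worked example can be sanity-checked numerically. This is a quick check, not a proof; the real argument is the algebra above.

```python
# Numeric sanity check (not a proof) that c = 8, n0 = 1 witnesses
# 3n^2 + 5n = O(n^2): the inequality 3n^2 + 5n <= 8*n^2 holds for all n >= 1.

def f(n):
    return 3 * n**2 + 5 * n

assert all(f(n) <= 8 * n**2 for n in range(1, 10_000))

# The ratio f(n) / n^2 settles near 3, matching the Theta(n^2) claim.
print(f(10**6) / (10**6) ** 2)
```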
02 Log bases don't matter (but exponential bases do!)

This trips everyone up. All log bases are asymptotically identical — log₂n, log₃n, log₁₀n are all Θ(log n). This is because changing the base just multiplies by a constant, and constants vanish in Big-O.

log_a(n) = log_b(n) / log_b(a) → just a constant factor apart
Logs: base doesn't matter. Exponentials: base absolutely matters.
2ⁿ vs 3ⁿ are NOT equivalent — lim(2/3)ⁿ = 0, so 2ⁿ = o(3ⁿ).
Worked Example — from your problem set
Is f(n) = 2log₃n + 3log₅n + 4log₇n equal to Ω(log n)?
1. Convert each term: log₃n = log n / log 3 = c₁ · log n
2. Same for log₅n = c₂ · log n and log₇n = c₃ · log n
3. So f(n) = (2c₁ + 3c₂ + 4c₃) · log n = some constant × log n
f(n) = Θ(log n) → YES, f = Ω(log n). The claim "f ≠ Ω(log n)" is FALSE.
FALSE
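The base-conversion collapse in steps 1 through 3 can be watched happening in numbers, using `math.log(n, b)` in the role of log_b n:

```python
import math

# f(n) = 2*log3(n) + 3*log5(n) + 4*log7(n) collapses to (constant) * ln(n),
# since log_b(n) = ln(n) / ln(b). That constant is all Theta/Omega cares about.

def f(n):
    return 2 * math.log(n, 3) + 3 * math.log(n, 5) + 4 * math.log(n, 7)

# The combined constant 2*c1 + 3*c2 + 4*c3 from the worked example:
c = 2 / math.log(3) + 3 / math.log(5) + 4 / math.log(7)

for n in (10, 10**3, 10**6):
    assert math.isclose(f(n), c * math.log(n))   # exactly c * ln(n), every n
```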
03 Sorting lower bound — why ⌈log₂(n!)⌉?

Imagine you want to sort a deck of n cards. Every comparison you make is a yes/no question. You need to ask enough questions to figure out which of the n! possible orderings is the right one.

Think of 20 Questions. To identify 1 of n! arrangements, a binary decision tree needs at least n! leaves — which forces depth at least ⌈log₂(n!)⌉. That depth = the worst-case number of comparisons.
Why this equals Θ(n log n)
Stirling's Approximation shortcut
1. n! ≈ (n/e)ⁿ · √(2πn) [Stirling's formula — just accept it]
2. log₂(n!) ≈ log₂((n/e)ⁿ) = n · log₂(n/e) = n·log₂n − n·log₂e
3. The n·log₂e term is just O(n) — dominated by n·log n
log₂(n!) = Θ(n log n)

So any comparison sort needs Ω(n log n) comparisons. Merge sort achieves O(n log n) → it's optimal!
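You can watch the bound converge numerically. The ratio climbs toward 1 only slowly, because the −n·log₂e correction fades like 1/log n:

```python
import math

# The decision-tree bound ceil(log2(n!)) versus the familiar n*log2(n):
# their ratio creeps up toward 1, illustrating log2(n!) = Theta(n log n).

def min_comparisons(n):
    return math.ceil(math.log2(math.factorial(n)))

ratios = [min_comparisons(n) / (n * math.log2(n)) for n in (10, 100, 1000)]
print(ratios)   # increasing: roughly 0.66, 0.79, 0.86
```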

04 n^(log_k n) — superpolynomial growth

The tricky one. n^(log_k n) looks polynomial but it isn't — because the exponent itself grows with n.

n² has a fixed exponent. n^(log n) has an exponent that keeps growing. It's like a polynomial whose degree gets bigger and bigger — it'll eventually beat n^100, n^1000, any n^c.
Worked Example — from your problem set
Is n^(log_k n) = O(n^c) for any constant c, when k > 1 (so ln k > 0)?
1. Rewrite: n^(log_k n) = e^(log_k(n) · ln n)
2. The exponent = log_k(n) · ln n = (ln n / ln k) · ln n = (ln n)² / ln k
3. For n^c the exponent is c · ln n
4. Compare: (ln n)²/ln k vs c · ln n. Divide both by ln n → ln n/ln k vs c. Since ln n → ∞, the left side eventually wins for any fixed c.
n^(log_k n) grows faster than n^c for every fixed c → NOT O(n^c)
FALSE — the statement is wrong
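Taking logs of both sides makes step 4 concrete; the helper names below are just for illustration:

```python
import math

# Compare on the log scale: log(n^(log2 n)) = (log2 n) * ln(n), which grows
# like (ln n)^2, versus log(n^c) = c * ln(n). Once log2(n) exceeds the fixed
# exponent c, the superpolynomial side wins.

def log_superpoly(n, k=2):       # log of n^(log_k n)
    return math.log(n, k) * math.log(n)

def log_poly(n, c):              # log of n^c
    return c * math.log(n)

n = 2 ** 1000                    # here log2(n) = 1000, far above c = 100
assert log_superpoly(n) > log_poly(n, c=100)
```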
05 f + g = O(h) → each is O(h) separately?

Yes — and the intuition is simple. If the total of two positive things is bounded by h, then each one on its own must also be bounded by h.

If two friends together spent at most $100 today, and neither spent negative money, then each one individually spent at most $100.
Proof sketch
f + g = O(h) ⟹ f = O(h) and g = O(h)
1. We know: f(n) + g(n) ≤ c · h(n) for all large n
2. Since g(n) ≥ 0 for large n (asymptotically positive): f(n) ≤ f(n) + g(n) ≤ c · h(n)
3. Same logic: g(n) ≤ f(n) + g(n) ≤ c · h(n)
Both f = O(h) and g = O(h) hold with the same constant c.
TRUE
Note: The "asymptotically positive" condition matters — it rules out negative functions that could inflate the sum artificially.
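A tiny numeric illustration of the proof sketch. The functions f, g, h and the constant c = 11 are made up for the example:

```python
# If f(n) + g(n) <= c*h(n) and both f, g are nonnegative, then each of
# f and g is <= c*h(n) on its own. Hypothetical f, g, h with c = 11.

def f(n):
    return n

def g(n):
    return 3 * n + 7

def h(n):
    return n

c = 11   # n + (3n + 7) <= 11n holds for all n >= 1

for n in range(1, 1000):
    assert f(n) + g(n) <= c * h(n)   # the sum is bounded...
    assert f(n) <= c * h(n)          # ...so each term is bounded too
    assert g(n) <= c * h(n)
```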
06 √f = O(√g) when f = O(g)? And 2^f = O(2^g)?

Square roots: yes. Exponentials: no. Here's why:

√f = O(√g) — TRUE
1. f ≤ c · g (given)
2. Take square roots of both sides (valid since both sides are nonnegative)
√f ≤ √c · √g → √f = O(√g) ✓
2^f = O(2^g) — FALSE (counterexample)
1. Let f(n) = 2n, g(n) = n. Then f = O(g) ✓. But 2^f = 4ⁿ and 2^g = 2ⁿ, and 4ⁿ ≠ O(2ⁿ) since the ratio (4/2)ⁿ → ∞.
2. The problem-set version claims 2^f = Ω(2^g) instead. Take f = 1, g = n: f = O(g) ✓, but 2^f = 2 is constant while 2^g = 2ⁿ grows → 2^f ≠ Ω(2^g).
Either way, exponentiating destroys the relationship: constant-factor slack in the exponent becomes exponential slack in the value.
FALSE (whether the claim is O or Ω)
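Both failure modes show up immediately in numbers:

```python
# f = O(g) does not survive exponentiation, in either direction.
# (1) f = 2n, g = n: f = O(g), but 2^f / 2^g = 4^n / 2^n = 2^n -> infinity,
#     so 2^f is not O(2^g).
# (2) f = 1, g = n: f = O(g), but 2^f / 2^g = 2 / 2^n -> 0,
#     so 2^f is not Omega(2^g).

for n in (10, 20, 40):
    print(4**n / 2**n)   # blows up: 1024, 1048576, ...
    print(2 / 2**n)      # vanishes toward 0
```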
Exam Strategy
End-to-end approach for every type of asymptotic question
1
Identify what you're comparing
Is it f vs g directly? Or a transformation of f and g (like √f vs √g, or 2^f vs 2^g)?
Write down the functions clearly before doing anything else.
2
Simplify using the hierarchy
Drop lower-order terms. Identify the dominant term.

Growth order (slow → fast):
1 < log n < √n < n < n log n < n² < n³ < 2ⁿ < n! < nⁿ
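Plugging one arbitrary size into the hierarchy shows the ordering directly. The last three classes are omitted here only because their values explode:

```python
import math

# Evaluate the slow end of the growth hierarchy at a single n. Each entry
# should exceed the previous one (2^n, n!, n^n omitted: far too large).

n = 2 ** 16
values = [1, math.log2(n), math.sqrt(n), n, n * math.log2(n), n**2, n**3]
assert values == sorted(values)   # slow -> fast, each dominates the last
print(values)
```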
3
Use the limit test
Compute L = lim(n→∞) f(n)/g(n):

Limit L → Conclusion
L = 0 → f = o(g): f = O(g) but f ≠ Ω(g)
0 < L < ∞ → f = Θ(g): f is both O(g) and Ω(g)
L = ∞ → f = ω(g): f = Ω(g) but f ≠ O(g)
Use L'Hôpital if you get 0/0 or ∞/∞.
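A rough numeric stand-in for the limit test, for checking your algebra. It is a heuristic, not a proof: slow log factors can look flat at small n.

```python
import math

# Sample f(n)/g(n) at growing n. Ratios heading to 0 suggest f = o(g);
# leveling off at a positive constant suggests Theta; blowing up suggests
# omega. Treat as a sanity check only, never as the proof itself.

def ratio_samples(f, g, ns=(10**3, 10**6, 10**9)):
    return [f(n) / g(n) for n in ns]

# Example: f(n) = n*log n versus g(n) = n^2 -> ratios shrink toward 0.
samples = ratio_samples(lambda n: n * math.log(n), lambda n: n * n)
assert samples[0] > samples[1] > samples[2]
```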
4
For "prove or disprove" questions
To PROVE f = O(g): Construct explicit c and n₀. Show the algebra.
To DISPROVE: Find a counterexample, or show the limit is ∞ (or 0 when Ω is claimed).

For n! use Stirling: log(n!) = Θ(n log n)
For logs use log_a(n) = log(n)/log(a) — bases are just constants
For exponentials: aⁿ vs bⁿ — compare bases directly
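The Stirling tip is easy to verify: `math.lgamma(n + 1)` gives log(n!) (natural log) even for huge n, so the ratio against n·log n can be watched converging.

```python
import math

# Stirling says log(n!) ~ n*log n - n, so log(n!) / (n * log n) is roughly
# 1 - 1/ln(n), creeping up toward 1 as n grows.

def log_factorial(n):
    return math.lgamma(n + 1)   # log(n!) without computing n! itself

ratios = [log_factorial(n) / (n * math.log(n)) for n in (10**2, 10**4, 10**6)]
print(ratios)   # increasing toward 1
```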
5
Fill-in-the-blank table: systematic check
For each f, g pair:
1. Find dominant terms → classify both as Θ(something)
2. If f = Θ(n^a) and g = Θ(n^b): compare a vs b
3. Fill O, Ω, Θ based on the relationship

Relationship → O? | Ω? | Θ?
f grows slower → ✓ Yes | ✗ No | ✗ No
f grows same rate → ✓ Yes | ✓ Yes | ✓ Yes
f grows faster → ✗ No | ✓ Yes | ✗ No
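When both functions are dominated by a power of n, the whole table reduces to comparing two exponents. A sketch of that rule; `classify` is a made-up helper name, not from the problem set:

```python
# If f = Theta(n^a) and g = Theta(n^b), every cell follows from a vs b.
# 'classify' is a hypothetical helper for filling the table quickly.

def classify(a, b):
    """Return (f = O(g)?, f = Omega(g)?, f = Theta(g)?) for n^a vs n^b."""
    return (a <= b, a >= b, a == b)

assert classify(1, 2) == (True, False, False)   # f grows slower
assert classify(2, 2) == (True, True, True)     # same rate
assert classify(3, 2) == (False, True, False)   # f grows faster
```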
6
Write up cleanly
Always state: what you're claiming, the constants you're using, and why the inequality holds. One clean line of math beats a paragraph of words.
Quick cheatsheet:
• log bases → same class · exponential bases → different class
• n^(log n) → superpolynomial (beats any n^c)
• n! grows faster than aⁿ for any constant a
• log(n!) = Θ(n log n) [Stirling]
• f+g=O(h) ⟹ each term is O(h) [if both positive]
• √f = O(√g) when f = O(g) · 2^f = Ω(2^g) is NOT guaranteed
Practice Quiz
Each answer shows a plain-English explanation
Fill in the Blanks
Visual Intuition
See how these functions actually behave
Growth Rate Hierarchy
How the main complexity classes compare as n grows
What Big-O Actually Means
f = O(g) means c·g(n) is an upper bound on f(n) for large n
Log Bases are Equivalent
log₂n, log₃n, log₁₀n differ only by a constant multiplier
The ratio log₂(n) / log₁₀(n) = log₂(10) ≈ 3.32 — a constant. Big-O ignores constants, so all log bases are Θ(log n).
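That constant ratio holds for every n, which a one-line check confirms:

```python
import math

# The ratio of any two log bases is a fixed constant: here
# log2(n) / log10(n) = log2(10) ~ 3.32 for every n > 1.

for n in (10, 10**3, 10**9):
    assert math.isclose(math.log2(n) / math.log10(n), math.log2(10))
```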
Factorials vs Exponentials
n! eventually dominates any fixed aⁿ — shown in log scale
Log scale used so they all fit. Steeper slope = faster growth. Notice n! overtakes 3ⁿ at n = 7 (6! = 720 is still below 3⁶ = 729, but 7! = 5040 beats 3⁷ = 2187).
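The exact crossover point is easy to locate by brute force:

```python
import math

# Find where n! first beats 3^n. Since 6! = 720 < 729 = 3^6 but
# 7! = 5040 > 2187 = 3^7, the crossover is n = 7.

crossover = next(n for n in range(1, 50) if math.factorial(n) > 3**n)
print(crossover)   # 7
```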