Think of two runners, f and g. The three notations O, Ω, and Θ are all asking one question: who wins in a long race?
Formally, f = O(g) means: there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. You're allowed to scale g by any constant — you just can't let it go below f forever.
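The definition can be sanity-checked numerically: pick candidate witnesses c and n₀ and test the inequality over a range of n. The functions and constants below are illustrative choices, not from the text, and a finite check is evidence rather than proof.

```python
# Numeric sanity check of f = O(g): verify f(n) <= c * g(n)
# for all n in [n0, n_max]. Finite range = evidence, not proof.

def is_bounded(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 5 * n   # f(n) = 3n^2 + 5n
g = lambda n: n**2               # g(n) = n^2

# 3n^2 + 5n <= 4n^2 exactly when n >= 5, so c=4, n0=5 are valid witnesses.
print(is_bounded(f, g, c=4, n0=5))   # True
print(is_bounded(f, g, c=3, n0=5))   # False: 3n^2 + 5n > 3n^2 for all n
```

Note that c = 3 fails for every n, which is exactly the "you can't let it go below f forever" clause: no tail of the sequence rescues a constant that is too small.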
This trips everyone up. All log bases are asymptotically identical — log₂n, log₃n, log₁₀n are all Θ(log n). This is because changing the base just multiplies by a constant, and constants vanish in Big-O.
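A quick numeric check makes the constant visible: by the change-of-base identity, log₂(n)/log₁₀(n) equals ln(10)/ln(2) ≈ 3.32 for every n.

```python
import math

# log_b(n) = ln(n) / ln(b), so log2(n) / log10(n) is the constant
# ln(10)/ln(2) at every n — exactly the factor Big-O ignores.

for n in (10, 1000, 10**6):
    print(n, math.log2(n) / math.log10(n))   # same value (~3.3219) each time
```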
Imagine you want to sort a deck of n cards. Every comparison you make is a yes/no question. You need to ask enough questions to figure out which of the n! possible orderings is the right one.
Distinguishing n! orderings takes at least log₂(n!) yes/no questions, and log₂(n!) = Θ(n log n) by Stirling's approximation. So any comparison sort needs Ω(n log n) comparisons. Merge sort achieves O(n log n), so it's optimal.
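The decision-tree bound can be computed directly: compare log₂(n!) against n·log₂(n) and watch them grow at the same rate. Using `math.lgamma(n + 1)` for ln(n!) avoids building enormous integers.

```python
import math

# Sorting must distinguish n! orderings, so it needs at least
# ceil(log2(n!)) comparisons. Comparing that to n*log2(n) shows
# the two grow at the same rate (Stirling).

for n in (10, 100, 1000):
    lower = math.lgamma(n + 1) / math.log(2)   # log2(n!) via ln-gamma
    print(n, round(lower), round(n * math.log2(n)))
```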
The tricky one. n^(log_k n) looks polynomial but it isn't, because the exponent itself grows with n: it eventually outgrows n^c for every fixed c.
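A numeric illustration with base-2 logs: n^(log₂ n) loses to n¹⁰ while log₂(n) < 10 but wins once log₂(n) > 10, i.e. past n = 1024. The exponent 10 here is an arbitrary choice for the demonstration.

```python
import math

# n^(log2 n) has a growing exponent, so it eventually beats n^k for
# ANY fixed k. With k = 10, the crossover is at log2(n) > 10 (n > 1024).

def quasi(n):
    return n ** math.log2(n)

print(quasi(1000) < 1000**10)   # True: exponent log2(1000) ≈ 9.97 < 10
print(quasi(2048) > 2048**10)   # True: exponent log2(2048) = 11 > 10
```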
Yes, and the intuition is simple: if the total of two positive things is bounded by h, then each one on its own must also be bounded by h, since f(n) ≤ f(n) + g(n) ≤ c·h(n).
Square roots: yes. Exponentials: no. Here's why:
1 < log n < √n < n < n log n < n² < n³ < 2ⁿ < n! < nⁿ
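Evaluating every function in the chain at one large n shows the ordering numerically. This is a single data point, not a proof: the chain is a statement about limits as n → ∞.

```python
import math

# Evaluate each function in the growth hierarchy at n = 100 and
# check the list comes out strictly increasing.

n = 100
chain = [
    ("1",       1),
    ("log n",   math.log2(n)),
    ("sqrt n",  math.isqrt(n)),
    ("n",       n),
    ("n log n", n * math.log2(n)),
    ("n^2",     n**2),
    ("n^3",     n**3),
    ("2^n",     2**n),
    ("n!",      math.factorial(n)),
    ("n^n",     n**n),
]
values = [v for _, v in chain]
print(values == sorted(values))   # True: increasing at n = 100
```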
| Limit L = lim f(n)/g(n) as n → ∞ | Conclusion |
|---|---|
| L = 0 | f = o(g) → f = O(g) but f ≠ Ω(g) |
| 0 < L < ∞ | f = Θ(g) → f is both O(g) and Ω(g) |
| L = ∞ | f = ω(g) → f = Ω(g) but f ≠ O(g) |
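The limit test can be sketched as a rough numeric classifier: estimate the ratio f(n)/g(n) at one large n and read off the table. The probe point n = 10⁸ and the threshold 10⁻⁶ are ad-hoc choices, and the estimate can mislead for slowly diverging ratios like log n.

```python
# Rough numeric version of the limit test: sample f(n)/g(n) at a
# large n, then classify using the three rows of the table.

def classify(f, g, n=10**8, eps=1e-6):
    r = f(n) / g(n)
    if r < eps:
        return "f = o(g)"       # L = 0
    if r > 1 / eps:
        return "f = ω(g)"       # L = ∞
    return "f = Θ(g)"           # 0 < L < ∞

print(classify(lambda n: n,      lambda n: n**2))   # f = o(g)
print(classify(lambda n: 3*n**2, lambda n: n**2))   # f = Θ(g)
print(classify(lambda n: n**3,   lambda n: n))      # f = ω(g)
```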
- Stirling: log(n!) = Θ(n log n)
- log_a(n) = log(n)/log(a), so bases are just constants
- aⁿ vs bⁿ: compare the bases directly
| Relationship | O? | Ω? | Θ? |
|---|---|---|---|
| f grows slower | ✓ Yes | ✗ No | ✗ No |
| f grows same rate | ✓ Yes | ✓ Yes | ✓ Yes |
| f grows faster | ✗ No | ✓ Yes | ✗ No |
| f(n) | g(n) | f = O(g)? | f = Ω(g)? | f = Θ(g)? |
|---|---|---|---|---|