Big-O notation provides a formal way to describe the upper bound on an algorithm’s resource usage (time or space) as a function of input size $n$.
Formal definition:
$f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0 > 0$ such that:
$$f(n) \leq c \cdot g(n) \quad \text{for all } n \geq n_0$$
In plain English: $f(n)$ grows no faster than $g(n)$ for large enough $n$.
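To make the definition concrete, here is a quick numerical sketch (the function and the witness constants $c = 3$, $n_0 = 3$ are illustrative choices, not part of the original notes) showing that $f(n) = 2n + 3$ is $O(n)$:

```python
def f(n):
    return 2 * n + 3  # the function we are bounding

def g(n):
    return n  # the claimed upper-bound shape

# Witness constants for the Big-O definition: 2n + 3 <= 3n whenever n >= 3
c, n0 = 3, 3

# The definition requires f(n) <= c * g(n) for all n >= n0
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

Any larger $c$ (with a suitable $n_0$) also works — Big-O only asks that *some* pair of constants exists.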
KEY TAKEAWAY: Big-O describes the worst-case growth rate. It ignores constants and lower-order terms, focusing on the dominant behaviour for large inputs.
Worst-case complexity is the maximum number of operations the algorithm performs over all inputs of size $n$.
Example: Linear search on array $A$ of size $n$:
- Best case: Target is $A[0]$ → 1 comparison = $O(1)$
- Average case: Target equally likely at each position → about $n/2$ comparisons on average = $O(n)$
- Worst case: Target not found → $n$ comparisons = $O(n)$
VCE Algorithmics focuses on worst-case unless stated otherwise.
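The three cases above can be observed directly by instrumenting linear search with a comparison counter (a sketch; the helper name and return format are illustrative):

```python
def linear_search(A, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(A):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

A = list(range(10))          # n = 10
print(linear_search(A, 0))   # best case: (0, 1)   -> O(1)
print(linear_search(A, 99))  # worst case: (-1, 10) -> O(n), full scan
```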
When computing Big-O, apply these rules:
- Drop constant factors: $3n + 7$ is $O(n)$, not $O(3n)$.
- Keep only the dominant (fastest-growing) term: $n^2 + n \log n + 5$ is $O(n^2)$.
- Nested loops multiply: an $O(n)$ loop inside an $O(n)$ loop gives $O(n^2)$.
- Sequential (non-nested) steps add, and the dominant term wins: $O(n) + O(n^2) = O(n^2)$.
Ordering of common growth rates (slowest to fastest):
$$O(1) < O(\log n) < O(n) < O(n \log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)$$
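This ordering can be checked numerically at $n = 10$ (a sketch; note the ordering is asymptotic, and some pairs cross at smaller $n$ — e.g. $2^n < n^3$ for $n \leq 9$):

```python
import math

n = 10
# One value per growth rate, in the order listed above
growth = [1, math.log2(n), n, n * math.log2(n), n**2, n**3, 2**n, math.factorial(n)]

# At n = 10 the list is already strictly increasing
assert growth == sorted(growth)
```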
| Algorithm | Time complexity | Why |
|---|---|---|
| Array access $A[i]$ | $O(1)$ | Direct memory access |
| Binary search | $O(\log n)$ | Halves search space each step |
| Linear search | $O(n)$ | May scan every element |
| Mergesort | $O(n \log n)$ | $\log n$ levels, $O(n)$ work per level |
| Bubble sort | $O(n^2)$ | Two nested loops |
| Floyd-Warshall | $O(n^3)$ | Three nested loops |
| Brute-force TSP | $O(n!)$ | Enumerate all permutations |
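The $O(\log n)$ row can be verified by counting how many times binary search halves the search space (a sketch; the step counter is added for illustration):

```python
def binary_search(A, target):
    """Return (index, steps) on a sorted list; index is -1 if absent."""
    lo, hi, steps = 0, len(A) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if A[mid] == target:
            return mid, steps
        elif A[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

A = list(range(1_000_000))
_, steps = binary_search(A, -1)  # absent target forces the worst case
print(steps)  # roughly log2(1_000_000) ≈ 20 halvings, not a million comparisons
```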
With a modern computer doing $10^8$ operations/second:
| Complexity | $n = 100$ | $n = 10^4$ | $n = 10^6$ |
|---|---|---|---|
| $O(n)$ | instant | instant | instant |
| $O(n \log n)$ | instant | instant | $\approx$ 0.2s |
| $O(n^2)$ | instant | $\approx$ 1s | $\approx$ 167 min |
| $O(2^n)$ | $\approx 10^{14}$ years | astronomical | astronomical |
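The table entries follow from simple arithmetic at the assumed rate of $10^8$ operations/second (a sketch; `est_seconds` is an illustrative helper, not a standard function):

```python
import math

RATE = 10**8  # assumed operations per second, as in the table

def est_seconds(ops):
    """Estimated wall-clock time for a given operation count."""
    return ops / RATE

# O(n^2) at n = 10^4: 10^8 operations -> 1 second
assert est_seconds(10_000**2) == 1.0

# O(n^2) at n = 10^6: 10^12 operations -> about 167 minutes
assert round(est_seconds(1_000_000**2) / 60) == 167

# O(n log n) at n = 10^6: about 2 * 10^7 operations -> about 0.2 seconds
assert round(est_seconds(1_000_000 * math.log2(1_000_000)), 1) == 0.2
```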
Worked example: determine the time complexity of the following pseudocode.

```
ALGORITHM Mystery(A, n)
    result ← 0
    for i ← 0 to n-1 do          // n iterations
        for j ← i to n-1 do      // n-i iterations
            result ← result + A[i] * A[j]
        end for
    end for
    return result
```
Total inner-loop iterations: $n + (n-1) + \cdots + 1 = n(n+1)/2$
Time complexity: $O(n^2)$
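The iteration count $n(n+1)/2$ can be confirmed by translating the pseudocode to Python with a counter added (a sketch for checking the analysis):

```python
def mystery(A):
    """Python version of Mystery; also counts inner-loop iterations."""
    n = len(A)
    result, iterations = 0, 0
    for i in range(n):
        for j in range(i, n):      # n - i iterations
            result += A[i] * A[j]
            iterations += 1
    return result, iterations

n = 50
_, iterations = mystery(list(range(n)))
assert iterations == n * (n + 1) // 2  # matches the closed form: n(n+1)/2
```

Since $n(n+1)/2 = n^2/2 + n/2$, dropping the constant $1/2$ and the lower-order $n/2$ term gives $O(n^2)$.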
EXAM TIP: In exams, show your working when determining Big-O. Count loop iterations explicitly, state the dominant term, and drop constants. Simply writing $O(n^2)$ without justification may not receive full marks.
COMMON MISTAKE: Confusing Big-O (upper bound / worst case) with the exact running time. An algorithm with $T(n) = 3n + 7$ is $O(n)$ — not $O(3n)$ or $O(n + 7)$.