The Computing Series

The Concept

Big-O notation describes an upper bound on an algorithm’s growth rate. It captures the dominant term of the resource function and discards constant factors and lower-order terms.

Formally: f(n) is O(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that f(n) ≤ c · g(n) for all n ≥ n₀.
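
The definition can be exercised numerically. The helper `is_bounded_by` below is a hypothetical name introduced for illustration; it samples a finite range of n, so it can refute a claimed bound but never prove one.

```python
# Numeric sanity check of the definition: f(n) is O(g(n)) if some
# constants c > 0 and n0 >= 0 make f(n) <= c * g(n) for every n >= n0.
# Only a finite range is sampled, so this can refute but not prove a bound.

def is_bounded_by(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all sampled n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# f(n) = 3n + 10 is O(n): pick c = 4 and n0 = 10, since 3n + 10 <= 4n once n >= 10.
print(is_bounded_by(lambda n: 3 * n + 10, lambda n: n, c=4, n0=10))   # True
# f(n) = n^2 is not O(n): any fixed c fails once n grows past c.
print(is_bounded_by(lambda n: n * n, lambda n: n, c=100, n0=1))       # False
```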

In engineering terms: O(g(n)) means “grows no faster than g(n) for large n, ignoring constants.”

Why Discard Constants?

Constants depend on hardware, compiler, language runtime, and memory layout. An algorithm that does 1,000 simple operations per element and one that does 1 operation per element are both O(n). On different hardware, the ratio between them shifts. But both remain O(n) forever — their growth shape does not change.

The growth shape is what determines scalability. Constants determine absolute speed on specific hardware. For system design, growth shape matters more.
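
A minimal sketch of the point about constants: both functions below are O(n), because work grows in direct proportion to n; only the hidden constant factor differs.

```python
# Both functions are O(n). One does ~1,000 operations per element, the
# other does 1, so their absolute speeds differ by a large constant
# factor -- but doubling n doubles the work for both. Same growth shape.

def one_op_per_element(items):
    total = 0
    for x in items:
        total += x                # 1 operation per element
    return total

def many_ops_per_element(items):
    total = 0
    for x in items:
        for _ in range(1000):     # ~1,000 operations per element
            total += x
    return total

print(one_op_per_element([1, 2, 3]))     # 6
print(many_ops_per_element([1, 2, 3]))   # 6000
```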

The Common Complexity Classes

The six complexity classes you will encounter most often, ordered by growth rate:

O(1) — Constant. The algorithm takes the same time regardless of input size. Array index lookup. Hash table get (average case). Stack push/pop.
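
The O(1) operations named above, sketched with Python stand-ins (a list for the array and the stack, a dict for the hash table):

```python
arr = [10, 20, 30]
assert arr[1] == 20           # array index lookup: O(1)

table = {"a": 1}
assert table.get("a") == 1    # hash table get: O(1) average case

stack = []
stack.append(42)              # stack push: O(1)
assert stack.pop() == 42      # stack pop: O(1)
```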

O(log n) — Logarithmic. The algorithm’s work grows by a constant amount each time n doubles. Binary search. B-tree lookup. Finding the height of a balanced binary tree.
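
A textbook sketch of binary search, the canonical O(log n) example: each iteration halves the remaining range, so a sorted array of n elements needs at most about log₂(n) iterations.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search range each pass
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 1000, 2))    # 500 sorted even numbers
print(binary_search(data, 40))    # 20
print(binary_search(data, 41))    # -1
```

Python’s standard library offers the same idea via the `bisect` module.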

O(n) — Linear. The algorithm visits each element a constant number of times. Linear scan. Finding the maximum of an unsorted array. Reading a file.
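
The O(n) pattern above, sketched as a maximum scan: one pass, constant work per element (Python’s built-in `max` does the same).

```python
def find_max(items):
    """Linear scan: visit each element exactly once."""
    best = items[0]
    for x in items[1:]:
        if x > best:
            best = x
    return best

print(find_max([3, 9, 2, 7]))   # 9
```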

O(n log n) — Linearithmic. The dominant complexity class for comparison-based sorting. Merge sort. Heapsort. Fast Fourier Transform. The proven lower bound for sorting arbitrary data by comparisons.
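
A sketch of merge sort, one of the O(n log n) algorithms named above: log₂(n) levels of splitting, with O(n) merge work at each level.

```python
def merge_sort(items):
    """Return a new sorted list; O(n log n) comparisons."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```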

O(n²) — Quadratic. The algorithm compares every element to every other element. Bubble sort. Insertion sort (worst case). Naive string matching.
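
A sketch of bubble sort, one of the O(n²) examples above: the nested loops compare elements pairwise, roughly n²/2 comparisons in total.

```python
def bubble_sort(items):
    """Nested loops over the input: O(n^2) comparisons."""
    a = list(items)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]   # swap out-of-order pair
    return a

print(bubble_sort([4, 1, 3, 2]))   # [1, 2, 3, 4]
```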

O(2ⁿ) — Exponential. The algorithm’s runtime doubles with each additional element. Naive recursive Fibonacci. Generating all subsets of a set. Brute-force traveling salesman.
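
Naive recursive Fibonacci, the exponential example above: each call spawns two more, so the call tree roughly doubles with each extra unit of n (the tight bound is O(φⁿ) with φ ≈ 1.618, still exponential).

```python
def fib(n):
    """Naive recursion: exponential time, since fib(n) recomputes
    the same subproblems over and over."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))   # 55
```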

Visual Intuition: What These Mean at Scale

The following table shows approximate operation counts for common complexity classes. At n = 1 billion, the difference between O(log n) and O(n²) is the difference between 30 operations and 1,000,000,000,000,000,000 (10¹⁸) operations.

Complexity      n = 1,000      n = 1,000,000      n = 1,000,000,000
O(1)            1              1                  1
O(log n)        10             20                 30
O(n)            1,000          1,000,000          1,000,000,000
O(n log n)      10,000         20,000,000         30,000,000,000
O(n²)           1,000,000      10¹²               10¹⁸
O(2ⁿ)           ~10³⁰¹         (astronomical)     (astronomical)

At 10⁹ operations per second (a rough estimate for a modern CPU), O(n) at n = 10⁹ takes about 1 second. O(n²) at n = 10⁹ takes about 31 years.
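
The arithmetic behind those two figures, under the same assumption of 10⁹ simple operations per second:

```python
# Back-of-envelope wall-clock estimates, assuming 1e9 operations/second.
OPS_PER_SECOND = 1e9
n = 1e9

linear = n / OPS_PER_SECOND          # O(n):  1e9 ops  -> 1 second
quadratic = n**2 / OPS_PER_SECOND    # O(n²): 1e18 ops -> 1e9 seconds
years = quadratic / (60 * 60 * 24 * 365)

print(f"O(n):  {linear:.0f} second(s)")   # 1 second
print(f"O(n²): {years:.0f} years")        # ~31 years
```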
