The Computing Series

How It Works

Deriving Big-O from Code

Three rules cover most cases:

Rule 1: Simple loops. A loop that runs n times, doing constant work per iteration, is O(n).

Rule 2: Nested loops. Two nested loops that each run n times are O(n²); three are O(n³). Each nesting level multiplies the work by n.

Rule 3: Divide-and-conquer recursion. A recursive call that halves the input and does O(n) work at each level is O(n log n). A recursive call that halves the input and does O(1) work at each level is O(log n).
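Rule 3's first case can be seen in merge sort: two recursive calls on halves, plus an O(n) merge at each level. A minimal sketch, not an implementation from this section:

```python
def merge_sort(items: list[int]) -> list[int]:
    """O(n log n): T(n) = 2T(n/2) + O(n) — two halves, linear merge."""
    if len(items) <= 1:
        return items  # Base case: O(1)

    mid = len(items) // 2
    left = merge_sort(items[:mid])    # T(n/2)
    right = merge_sort(items[mid:])   # T(n/2)

    # Merge step: O(n) work at this level of the recursion
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The recursion is log₂(n) levels deep, and each level does O(n) total merge work, giving O(n log n).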

def demonstrate_complexity_classes(data: list[int]) -> None:
    """
    Demonstrates: deriving Big-O from loop structure.
    Each section is labeled with its complexity.
    """
    n = len(data)

    # O(1) — constant: no loop, fixed number of operations
    first_element = data[0] if data else None

    # O(log n) — binary search: input halves on each iteration
    low, high = 0, n - 1
    while low < high:
        low = (low + high) // 2 + 1  # Demonstrating halving pattern only
        # In real binary search, this finds the target

    # O(n) — linear scan: visits each element once
    running_sum = 0
    for value in data:
        running_sum += value

    # O(n log n) — sort: Python's Timsort, O(n log n) in the worst case
    sorted_data = sorted(data)

    # O(n²) — nested loops: every pair of elements
    pair_count = 0
    for i in range(n):
        for j in range(i + 1, n):
            # Compares every unique pair — n*(n-1)/2 comparisons
            pair_count += 1

    # O(2^n) — subset generation: 2^n subsets for n elements
    # Only safe for small n; do not call with n > 25
    def generate_subsets(elements: list[int]) -> list[list[int]]:
        if not elements:
            return [[]]
        first = elements[0]
        rest_subsets = generate_subsets(elements[1:])
        # Each existing subset appears twice: without first, and with first
        return rest_subsets + [[first] + subset for subset in rest_subsets]

    all_subsets = generate_subsets(data[:10])  # Cap the input so 2^k stays small
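One way to sanity-check a derived class empirically is to count operations as n doubles: for the O(n²) pair loop above, doubling n should roughly quadruple the count. (`count_pair_operations` is a hypothetical helper added for illustration.)

```python
def count_pair_operations(n: int) -> int:
    """Counts iterations of the O(n²) unique-pair loop for input size n."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count  # Equals n*(n-1)/2

# Doubling n roughly quadruples the work — the signature of O(n²)
small = count_pair_operations(100)   # 4,950
large = count_pair_operations(200)   # 19,900
print(large / small)                 # ≈ 4.02
```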

Reading Complexity from Recursive Functions

For recursive functions, identify two things: how many recursive calls are made per call (branching factor), and how the input size changes per call.

def binary_search_recursive(sorted_array: list[int], target: int,
                             low: int, high: int) -> int:
    """
    Demonstrates: O(log n) recursive — one call per level, input halves each time.
    Recurrence: T(n) = T(n/2) + O(1)
    Solution: T(n) = O(log n)
    """
    if low > high:
        return -1  # Base case: search space exhausted

    mid = (low + high) // 2

    if sorted_array[mid] == target:
        return mid
    elif sorted_array[mid] < target:
        # One recursive call, on half the input — this is the halving step
        return binary_search_recursive(sorted_array, target, mid + 1, high)
    else:
        return binary_search_recursive(sorted_array, target, low, mid - 1)

One recursive call per level. Input halves. Depth of recursion: log₂(n). Total operations: O(log n).

Contrast with naive Fibonacci:

def fibonacci_naive(n: int) -> int:
    """
    Demonstrates: O(2^n) — two recursive calls per call, depth n.
    Do not call with n > 35; runtime grows exponentially.
    """
    if n <= 1:
        return n
    # Two recursive calls at every level — the call tree has 2^n nodes
    return fibonacci_naive(n - 1) + fibonacci_naive(n - 2)

Two recursive calls per call. Depth up to n. Total nodes in the call tree: O(2ⁿ) (more precisely, about φⁿ ≈ 1.6ⁿ). Each node does O(1) work. Total: O(2ⁿ).
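The blowup comes from recomputing the same subproblems over and over. Caching each result — sketched here with the standard-library decorator functools.lru_cache — means each value 0..n is computed once, collapsing the call tree to O(n) distinct calls:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci_memoized(n: int) -> int:
    """O(n): each value 0..n is computed once, then served from the cache."""
    if n <= 1:
        return n
    return fibonacci_memoized(n - 1) + fibonacci_memoized(n - 2)

print(fibonacci_memoized(90))  # Returns instantly; the naive version never would
```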

Dropping Lower-Order Terms

When an algorithm has multiple phases, the dominant term determines the Big-O class.

An algorithm that performs O(n²) steps followed by O(n) steps is O(n²) overall: at large n, the linear phase is negligible.

def two_phase_algorithm(data: list[int]) -> list[int]:
    """
    Demonstrates: O(n²) + O(n) = O(n²) — dominated by the quadratic phase.
    """
    result = []

    # Phase 1: O(n²) — build all pairs
    for i in range(len(data)):
        for j in range(len(data)):
            result.append(data[i] + data[j])

    # Phase 2: O(n) — a single linear pass over the original input
    offset = sum(data)  # One addition per element: n steps

    # Total: O(n²) + O(n) = O(n²) — the nested loop dominates
    return result
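To see just how negligible the lower-order term is, a quick back-of-the-envelope count at n = 1,000 (illustrative numbers only):

```python
n = 1_000
quadratic_steps = n * n  # 1,000,000 steps for the O(n²) phase
linear_steps = n         # 1,000 steps for the O(n) phase

# The linear phase's share of the total work
share = linear_steps / (quadratic_steps + linear_steps)
print(f"{share:.3%}")  # ~0.1% — dropping it changes nothing asymptotically
```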
