Understand
Rank the following complexity classes from fastest-growing to slowest-growing: O(n²), O(1), O(2ⁿ), O(n log n), O(n), O(log n). At n = 1,000, compute the approximate operation count for each.
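To sanity-check your ranking, a minimal sketch that tabulates the approximate operation count of each class at n = 1,000 (the class names and the use of log base 2 are assumptions for illustration):

```python
import math

# Approximate operation counts at n = 1,000 for each complexity class.
n = 1000
counts = {
    "O(1)": 1,
    "O(log n)": math.log2(n),        # about 10
    "O(n)": n,
    "O(n log n)": n * math.log2(n),  # about 10,000
    "O(n^2)": n**2,                  # 1,000,000
    "O(2^n)": 2**n,                  # astronomically large (~10^301)
}
for name, ops in counts.items():
    print(f"{name}: ~{ops:.3g}")
```

The base of the logarithm only changes the count by a constant factor, which Big-O ignores.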
An algorithm runs in exactly 3n² + 14n + 7 steps.
What is its Big-O class? Explain why you discard the 14n + 7 terms and the leading constant 3.
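A quick sketch to build intuition for why the lower-order terms are discarded: the ratio of the exact step count to the dominant 3n² term approaches 1 as n grows (the helper name `exact_steps` is an assumption):

```python
# Compare the exact step count to its dominant term as n grows.
def exact_steps(n: int) -> int:
    return 3 * n**2 + 14 * n + 7

for n in (10, 1_000, 100_000):
    ratio = exact_steps(n) / (3 * n**2)
    print(n, round(ratio, 4))  # ratio shrinks toward 1.0 as n grows
```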
Explain in one sentence why Big-O notation discards constant factors. Why is this useful despite being imprecise?
Apply
Determine the Big-O complexity of each fragment below, and justify your answer.

# Fragment A
for i in range(n):
    for j in range(10):  # Note: j goes to 10, not n
        do_work()

# Fragment B
i = n
while i > 1:
    i = i // 2
    do_work()

# Fragment C
for i in range(n):
    for j in range(i, n):
        for k in range(j, n):
            do_work()

A sorting algorithm claims to be O(n log n). At n = 100, it takes 0.1 ms. Predict its runtime at n = 1,000,000 and at n = 1,000,000,000. Show your work using the ratio method: if T(n) = c · n log n, then T(m)/T(n) = (m log m)/(n log n).
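The ratio method from the exercise can be sketched directly in code, assuming T(n) = c · n log₂ n (the function name `predict_ms` is an assumption for illustration):

```python
import math

# Scale a known runtime to a new input size using the n log n ratio.
def predict_ms(t_known_ms: float, n_known: int, n_new: int) -> float:
    scale = (n_new * math.log2(n_new)) / (n_known * math.log2(n_known))
    return t_known_ms * scale

print(predict_ms(0.1, 100, 1_000_000))      # roughly 3,000 ms (~3 s)
print(predict_ms(0.1, 100, 1_000_000_000))  # roughly 4.5e6 ms (~75 min)
```

Note that the base of the logarithm cancels in the ratio, so any base gives the same prediction.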
The following function is claimed to be O(n). Identify the hidden O(n) operation that makes the actual complexity O(n²).
def find_unique(items: list[str]) -> list[str]:
    unique = []
    for item in items:
        if item not in unique:  # What is the complexity of this check?
            unique.append(item)
    return unique

Design
A team is building a feature that checks whether a username is taken. Option A: scan the list of all usernames on each check (O(n)). Option B: maintain a hash set of usernames and check membership (O(1) average). The current user count is 50,000. The service handles 10,000 username checks per second. At what user count does Option A become unacceptable if the latency budget per check is 1ms, and the list scan performs 10⁸ comparisons per second?
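A back-of-envelope sketch for the per-check latency bound, assuming a worst-case scan compares against every stored username:

```python
# At 10^8 comparisons/second, how many names can a full scan cover
# within a 1 ms latency budget?
comparisons_per_second = 10**8
latency_budget_s = 1e-3

max_n = int(comparisons_per_second * latency_budget_s)
print(max_n)  # 100000: a single scan blows the 1 ms budget past ~100,000 users
```

Consider also the aggregate throughput constraint: at 10,000 checks per second, the scans must collectively fit within the 10⁸ comparisons per second, which binds at a much smaller user count.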
Design a data structure that supports O(1) insert, O(1) delete, and O(1) get-random-element. Describe the structure, the operations, and prove each operation’s complexity. (Hint: use both a hash map and a dynamic array in combination.)
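One standard construction satisfying these requirements (a sketch, not necessarily the only valid answer) pairs a dynamic array with a hash map from element to array index; delete swaps the victim with the last element so the array never develops holes:

```python
import random

class RandomizedSet:
    """Supports O(1)-average insert, delete, and get_random."""

    def __init__(self) -> None:
        self.items: list[int] = []        # elements, densely packed
        self.index: dict[int, int] = {}   # element -> its position in items

    def insert(self, x: int) -> bool:
        if x in self.index:               # O(1) average hash lookup
            return False
        self.index[x] = len(self.items)
        self.items.append(x)              # amortized O(1) append
        return True

    def delete(self, x: int) -> bool:
        if x not in self.index:
            return False
        i = self.index[x]
        last = self.items[-1]
        self.items[i] = last              # move last element into the hole
        self.index[last] = i
        self.items.pop()                  # O(1) removal from the end
        del self.index[x]
        return True

    def get_random(self) -> int:
        return random.choice(self.items)  # uniform via O(1) array indexing
```

The swap-with-last trick is the key step: deleting from the middle of an array is normally O(n), but since element order is irrelevant here, filling the hole with the last element keeps the array dense in O(1).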