Ch8.8.1: Master Theorem (Appendix)
Overview
Many algorithms are defined recursively. To analyze their time complexity, we often write a recurrence relation describing how the running time depends on smaller subproblems.
The Master Theorem is a tool that solves many common recurrences of the form:
T(n) = a · T(n / b) + f(n)
where:
- a = number of subproblems
- b = factor by which the problem size shrinks
- f(n) = cost outside the recursive calls
This appendix teaches you how to compute and prove time complexity using the Master Theorem, with examples including mergesort and quicksort.
1. The Master Theorem
Consider the recurrence:
T(n) = a · T(n / b) + f(n)
Define the critical exponent log_b(a) and the corresponding critical term:
n^(log_b(a))
Compare f(n) to n^(log_b(a)).
Case 1: f(n) is smaller
f(n) = O(n^(log_b(a) - ε))
for some ε > 0.
Then:
T(n) = Θ(n^(log_b(a)))
Case 2: f(n) matches the critical exponent
f(n) = Θ(n^(log_b(a)) · ln^k(n))
for some k ≥ 0.
Then:
T(n) = Θ(n^(log_b(a)) · ln^(k+1)(n))
Case 3: f(n) is larger
f(n) = Ω(n^(log_b(a) + ε))
for some ε > 0, and the regularity condition a · f(n / b) ≤ c · f(n) holds for some constant c < 1 and all sufficiently large n.
Then:
T(n) = Θ(f(n))
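
Before turning to worked examples, here is a small Python helper that mechanizes the case analysis for the common situation where f(n) = Θ(n^d). This is a minimal illustrative sketch (the name master_theorem and its interface are not from the text); it assumes a ≥ 1, b > 1, and a purely polynomial f(n), so Case 2 appears only with k = 0 and the Case 3 regularity condition holds automatically.

    import math

    def master_theorem(a, b, d):
        """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem.

        Illustrative sketch: assumes a >= 1, b > 1, d >= 0 and f(n) = Theta(n^d).
        """
        crit = math.log(a, b)  # critical exponent log_b(a)
        if d < crit:
            return f"Theta(n^{crit:g})"           # Case 1: f(n) polynomially smaller
        elif d == crit:                           # exact comparison is fine for small integer inputs
            return f"Theta(n^{crit:g} * ln(n))"   # Case 2 with k = 0
        else:
            return f"Theta(n^{d:g})"              # Case 3: f(n) polynomially larger

    # master_theorem(2, 2, 1)  -> "Theta(n^1 * ln(n))"   (mergesort, below)
    # master_theorem(2, 2, 0)  -> "Theta(n^1)"           (binary tree recursion, below)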
2. Example: Mergesort
Mergesort splits the array into two halves, recursively sorts each half, and then merges them in linear time.
T(n) = 2 · T(n / 2) + Θ(n)
Here:
- a = 2
- b = 2
- f(n) = Θ(n)
Critical exponent:
n^(log_2(2)) = n
Since f(n) = Θ(n) matches the critical term exactly, this is Case 2 with k = 0.
T(n) = Θ(n · ln(n))
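
Below is a minimal Python sketch of mergesort whose structure mirrors the recurrence: two recursive calls on halves plus a linear-time merge. (The name merge_sort is illustrative; the text does not prescribe an implementation.)

    def merge_sort(xs):
        """Sort a list: T(n) = 2*T(n/2) + Theta(n)."""
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left = merge_sort(xs[:mid])    # T(n/2)
        right = merge_sort(xs[mid:])   # T(n/2)
        # Linear-time merge: the Theta(n) term of the recurrence.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged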
3. Example: Quicksort (average case)
In the average case, Quicksort's pivot splits the array into two subproblems of roughly equal size, and partitioning takes linear time, so the expected running time is well modeled by:
T(n) = 2 · T(n / 2) + Θ(n)
This is the same recurrence as mergesort, so:
T(n) = Θ(n · ln(n))
This explains why Quicksort is fast on average.
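
As a concrete (if simplified) illustration, here is a short Python sketch of quicksort with a random pivot; randomization is what makes the balanced split a reasonable average-case model. This is not an in-place implementation, just a sketch that makes the structure of the recurrence visible.

    import random

    def quick_sort(xs):
        """Expected cost ~ T(n) = 2*T(n/2) + Theta(n) with a random pivot."""
        if len(xs) <= 1:
            return xs
        pivot = random.choice(xs)                 # random pivot: average-case model
        smaller = [x for x in xs if x < pivot]    # Theta(n) partitioning work
        equal   = [x for x in xs if x == pivot]
        larger  = [x for x in xs if x > pivot]
        return quick_sort(smaller) + equal + quick_sort(larger)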
4. Example: Quicksort (worst case)
In the worst case (pivot always smallest or largest), the recurrence becomes:
T(n) = T(n - 1) + Θ(n)
Expand the recurrence:
T(n) = n + (n - 1) + (n - 2) + ... + 1 = n(n + 1)/2 = Θ(n^2)
This recurrence does not fit the Master Theorem (the subproblem shrinks by subtraction rather than by a constant factor), but it is easy to solve directly.
5. Example: Binary tree recursion
Consider a function that recursively processes a roughly balanced binary tree with n nodes, doing constant work at each node:
T(n) = 2 · T(n / 2) + Θ(1)
Here:
- a = 2
- b = 2
- f(n) = Θ(1)
Critical exponent:
n^(log_2(2)) = n
Since f(n) = Θ(1) is polynomially smaller than n, this is Case 1.
T(n) = Θ(n)
This matches the intuition: visiting each node once is linear time.
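
A minimal Python sketch of such a function is shown below: it counts the nodes of a binary tree, doing constant work per node. (The Node class and count_nodes are illustrative names, and the 2·T(n/2) model assumes the tree is roughly balanced.)

    class Node:
        def __init__(self, value, left=None, right=None):
            self.value = value
            self.left = left
            self.right = right

    def count_nodes(root):
        """Constant work per node: T(n) = 2*T(n/2) + Theta(1) on a balanced tree."""
        if root is None:
            return 0
        return 1 + count_nodes(root.left) + count_nodes(root.right)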
6. When the Master Theorem does not apply
The Master Theorem does not apply to:
- recurrences where the subproblem sizes are not equal
- recurrences like T(n) = T(n - 1) + T(n - 2)
- recurrences with variable branching factors
- recurrences with non-polynomial f(n)
For example, the Fibonacci recurrence:
T(n) = T(n - 1) + T(n - 2) + Θ(1)
grows exponentially, as Θ(φ^n) with φ = (1 + √5)/2 ≈ 1.618, but this cannot be derived using the Master Theorem.
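
For concreteness, here is the naive recursive Fibonacci function whose running time satisfies exactly this recurrence; the point is only that its cost has the form T(n - 1) + T(n - 2), not a·T(n/b).

    def fib(n):
        """Naive recursion: T(n) = T(n-1) + T(n-2) + Theta(1), exponential time."""
        if n <= 1:
            return n
        return fib(n - 1) + fib(n - 2)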
Key takeaways
- The Master Theorem solves recurrences of the form T(n) = a·T(n/b) + f(n).
- Compare f(n) to n^(log_b(a)) to determine the case.
- Case 1: f(n) is smaller → Θ(n^(log_b(a))).
- Case 2: f(n) matches the critical exponent → Θ(n^(log_b(a)) · ln^(k+1)(n)).
- Case 3: f(n) is larger → Θ(f(n)), provided the regularity condition holds.
- Mergesort and average-case Quicksort both satisfy T(n) = 2T(n/2) + Θ(n) → Θ(n·ln(n)).
- Worst-case Quicksort is Θ(n^2) and does not fit the theorem.
- Not all recurrences can be solved with the Master Theorem (e.g., Fibonacci).
The Master Theorem is one of the most important tools for analyzing divide-and-conquer algorithms. With it, you can compute the time complexity of many recursive algorithms quickly and rigorously.