Divide-and-Conquer
Definition
- Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input data S into two or more disjoint subsets S1, S2, …
- Recur: solve the subproblems recursively
- Conquer: combine the solutions for S1, S2, …, into a solution for S
- The base case for the recursion is a subproblem of constant size
- Analysis can be done using recurrence equations
- Applications: Merge-Sort, Quick-Sort
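The three steps above can be sketched with merge-sort as the running example (a minimal Python sketch; the function names are illustrative):

```python
def merge_sort(s):
    if len(s) < 2:                 # base case: constant-size subproblem
        return s
    mid = len(s) // 2
    left = merge_sort(s[:mid])     # divide + recur on S1
    right = merge_sort(s[mid:])    # divide + recur on S2
    return merge(left, right)      # conquer: combine the two solutions

def merge(a, b):
    # Merge two sorted sequences in time linear in len(a) + len(b).
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out
```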
Recurrence Equation Analysis
- The conquer step of merge-sort, which merges two sorted sequences of n/2 elements each (implemented as doubly linked lists), takes at most bn steps, for some constant b.
- Likewise, the base case (n < 2) takes at most b steps.
- Therefore, if we let T(n) denote the running time of merge-sort:
  T(n) = b                if n < 2
  T(n) = 2T(n/2) + bn     if n ≥ 2
- We can therefore analyze the running time of merge-sort by finding a closed-form solution to the above equation.
- That is, a solution that has T(n) only on the left-hand side.
Iterative Substitution
- In the iterative substitution, or “plug-and-chug,” technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern:
  T(n) = 2T(n/2) + bn
       = 2(2T(n/2^2) + b(n/2)) + bn = 2^2 T(n/2^2) + 2bn
       = 2^3 T(n/2^3) + 3bn
       = …
       = 2^i T(n/2^i) + ibn
- Note that the base case, T(n) = b, occurs when 2^i = n. That is, i = log n.
- So, T(n) = bn + bn log n.
- Thus, T(n) is O(n log n).
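As a sanity check, the recurrence and the closed form can be compared numerically; this sketch assumes b = 1 and n a power of 2:

```python
def T(n, b=1):
    # Merge-sort recurrence: T(n) = b for n < 2, else 2*T(n/2) + b*n.
    if n < 2:
        return b
    return 2 * T(n // 2, b) + b * n

# For n = 2^k and b = 1, the closed form T(n) = bn + bn log n gives n + n*k.
for k in range(11):
    n = 2 ** k
    assert T(n) == n + n * k
```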
The Recursion Tree
- Draw the recursion tree for the recurrence relation and look for a pattern:
- Each node at depth i represents a subproblem of size n/2^i with cost b(n/2^i); there are 2^i nodes at depth i, so every level contributes a total cost of bn.
- The tree has depth log n, so the overall cost is bn(1 + log n), which is O(n log n).
Guess-and-Test Method
- In the guess-and-test method, we guess a closed-form solution and then try to prove it is true by induction. Consider the recurrence
  T(n) = 2T(n/2) + bn log n     (with T(n) = b for n < 2)
- Guess: T(n) < cn log n.
  T(n) = 2T(n/2) + bn log n
       ≤ 2(c(n/2) log(n/2)) + bn log n
       = cn(log n − 1) + bn log n
       = cn log n − cn + bn log n
- Wrong: we cannot make this last line be less than cn log n, since the bn log n term grows faster than the cn we subtract.
- Guess #2: T(n) < cn log^2 n.
  T(n) = 2T(n/2) + bn log n
       ≤ 2(c(n/2) log^2(n/2)) + bn log n
       = cn(log n − 1)^2 + bn log n
       = cn log^2 n − 2cn log n + cn + bn log n
       ≤ cn log^2 n
  provided c ≥ b.
- So, T(n) is O(n log^2 n).
- In general, to use this method, you need to have a good guess and you need to be good at induction proofs.
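A guessed bound can also be spot-checked numerically before attempting the induction. This sketch assumes the recurrence being analyzed here is T(n) = 2T(n/2) + bn log n with T(n) = b for n < 2; with b = 1 and n = 2^k, unrolling gives the exact solution n(1 + k(k + 1)/2), which is consistent with the Θ(n log^2 n) behavior:

```python
def T(n, b=1):
    # Assumed recurrence for this section: T(n) = 2T(n/2) + b*n*log(n).
    if n < 2:
        return b
    # n.bit_length() - 1 equals log2(n) exactly when n is a power of 2.
    return 2 * T(n // 2, b) + b * n * (n.bit_length() - 1)

# Exact solution for n = 2^k, b = 1: T(n) = n * (1 + k*(k+1)/2).
for k in range(12):
    n = 2 ** k
    assert T(n) == n * (1 + k * (k + 1) // 2)
```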
Master Method
- Many divide-and-conquer recurrence equations have the form:
  T(n) = c                if n < d
  T(n) = aT(n/b) + f(n)   if n ≥ d
- The Master Theorem:
  1. If f(n) is O(n^(log_b a − ε)) for some ε > 0, then T(n) is Θ(n^(log_b a)).
  2. If f(n) is Θ(n^(log_b a) log^k n), then T(n) is Θ(n^(log_b a) log^(k+1) n).
  3. If f(n) is Ω(n^(log_b a + ε)) for some ε > 0, and af(n/b) ≤ δf(n) for some δ < 1 and n sufficiently large, then T(n) is Θ(f(n)).
Examples
- T(n) = 4T(n/2) + n
- Solution: log_b a = 2, so case 1 says T(n) is O(n^2).
- T(n) = 2T(n/2) + n log n
- Solution: log_b a = 1, so case 2 says T(n) is O(n log^2 n).
- T(n) = T(n/3) + n log n
- Solution: log_b a = 0, so case 3 says T(n) is O(n log n).
- T(n) = 8T(n/2) + n^2
- Solution: log_b a = 3, so case 1 says T(n) is O(n^3).
- T(n) = 9T(n/3) + n^3
- Solution: log_b a = 2, so case 3 says T(n) is O(n^3).
- T(n) = T(n/2) + 1 (binary search)
- Solution: log_b a = 0, so case 2 says T(n) is O(log n).
- T(n) = 2T(n/2) + log n (heap construction)
- Solution: log_b a = 1, so case 1 says T(n) is O(n).
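The first example can be checked by unrolling the recurrence exactly. This sketch fixes T(1) = 1 and n a power of 2, where the exact solution 2n^2 − n agrees with the O(n^2) bound that case 1 predicts:

```python
def T(n):
    # First example above: T(n) = 4*T(n/2) + n, with T(1) = 1.
    return 1 if n < 2 else 4 * T(n // 2) + n

# Exact solution for n a power of 2: T(n) = 2*n^2 - n, which is Theta(n^2),
# matching case 1 (log_2 4 = 2 and f(n) = n is polynomially smaller).
for k in range(12):
    n = 2 ** k
    assert T(n) == 2 * n * n - n
```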
Iterative “Proof” of the Master Theorem
- Using iterative substitution, let us see if we can find a pattern:
  T(n) = aT(n/b) + f(n)
       = a^2 T(n/b^2) + af(n/b) + f(n)
       = …
       = a^(log_b n) T(1) + Σ_{i=0}^{(log_b n) − 1} a^i f(n/b^i)
       = n^(log_b a) T(1) + Σ_{i=0}^{(log_b n) − 1} a^i f(n/b^i)
- We then distinguish the three cases as:
- The first term is dominant (case 1)
- Each part of the summation is equally dominant (case 2)
- The summation is a geometric series (case 3)
Integer Multiplication
Algorithm: Multiply two n-bit integers I and J.
- Divide step: Split I and J into high-order and low-order bits:
  I = I_h 2^(n/2) + I_l
  J = J_h 2^(n/2) + J_l
- We can then define I*J by multiplying the parts and adding:
  I*J = I_h J_h 2^n + I_h J_l 2^(n/2) + I_l J_h 2^(n/2) + I_l J_l
- This requires four multiplications of n/2-bit numbers, so T(n) = 4T(n/2) + n, which implies T(n) is O(n^2).
- But that is no better than the algorithm we learned in grade school.
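A Python sketch of this four-multiplication scheme (shifts and masks split each integer into its two halves; `mult4` is an illustrative name, and n is assumed to be a power of 2):

```python
def mult4(I, J, n):
    # Divide-and-conquer multiplication with FOUR recursive products.
    # I, J are nonnegative integers; n is the bit length being split.
    if n <= 1:
        return I * J                          # constant-size base case
    h = n // 2
    Ih, Il = I >> h, I & ((1 << h) - 1)       # I = Ih*2^h + Il
    Jh, Jl = J >> h, J & ((1 << h) - 1)       # J = Jh*2^h + Jl
    # I*J = Ih*Jh*2^(2h) + (Ih*Jl + Il*Jh)*2^h + Il*Jl
    return ((mult4(Ih, Jh, h) << (2 * h))
            + ((mult4(Ih, Jl, h) + mult4(Il, Jh, h)) << h)
            + mult4(Il, Jl, h))
```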
An Improved Integer Multiplication Algorithm
Algorithm: Multiply two n-bit integers I and J.
- Divide step: Split I and J into high-order and low-order bits:
  I = I_h 2^(n/2) + I_l
  J = J_h 2^(n/2) + J_l
- Observe that there is a different way to multiply parts:
  I*J = I_h J_h 2^n + [(I_h − I_l)(J_l − J_h) + I_h J_h + I_l J_l] 2^(n/2) + I_l J_l
- This uses only three multiplications of n/2-bit numbers, since I_h J_l + I_l J_h is recovered from (I_h − I_l)(J_l − J_h) plus the two products already computed.
- So, T(n) = 3T(n/2) + n, which implies T(n) is O(n^(log_2 3)), by the Master Theorem.
- Thus, T(n) is O(n^1.585).
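The three-multiplication identity translates directly into a Karatsuba-style sketch (`karatsuba` is an illustrative name; the sign handling is needed because I_h − I_l and J_l − J_h can be negative):

```python
def karatsuba(I, J, n):
    # Divide-and-conquer multiplication with THREE recursive products.
    if I < 0:
        return -karatsuba(-I, J, n)           # reduce to nonnegative operands
    if J < 0:
        return -karatsuba(I, -J, n)
    if n <= 1:
        return I * J                          # constant-size base case
    h = n // 2
    Ih, Il = I >> h, I & ((1 << h) - 1)       # I = Ih*2^h + Il
    Jh, Jl = J >> h, J & ((1 << h) - 1)       # J = Jh*2^h + Jl
    P1 = karatsuba(Ih, Jh, h)                 # Ih*Jh
    P2 = karatsuba(Il, Jl, h)                 # Il*Jl
    P3 = karatsuba(Ih - Il, Jl - Jh, h)       # (Ih - Il)*(Jl - Jh)
    # Ih*Jl + Il*Jh = P3 + P1 + P2, so:
    # I*J = P1*2^(2h) + (P3 + P1 + P2)*2^h + P2
    return (P1 << (2 * h)) + ((P3 + P1 + P2) << h) + P2
```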