I was reading notes on dynamic programming, and I encountered the following comment.
If the subproblems are not independent, i.e. subproblems share subsubproblems, then a divide-and-conquer algorithm repeatedly solves the common subsubproblems. Thus, it does more work than necessary.
What does this mean? Can you give me examples to make the above point clear?
The author refers to the fact that many divide-and-conquer algorithms have subproblems that overlap with one another. Consider, for example, this very simple Fibonacci implementation:
int Fibonacci(int n) {
if (n <= 1) return n;
return Fibonacci(n - 1) + Fibonacci(n - 2);
}
If you trace out the calls made to compute Fibonacci(4), you get the following tree:
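Fibonacci(4)
 +- Fibonacci(3)
 |   +- Fibonacci(2)
 |   |   +- Fibonacci(1)
 |   |   +- Fibonacci(0)
 |   +- Fibonacci(1)
 +- Fibonacci(2)
     +- Fibonacci(1)
     +- Fibonacci(0)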
In other words, 9 total function calls are made, but there are only five unique calls here (Fibonacci of 0 to 4, inclusive). Notice that Fibonacci(2) is computed twice and Fibonacci(1) three times. This algorithm could be made much more efficient if the results of the recursive calls were shared across the subproblems rather than recomputed from scratch each time. This is one of the key ideas behind dynamic programming.
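For example, here is one way to share those calls: cache each result the first time it's computed. (This memoized variant is my own sketch, not code from the notes; memo[i] stores Fibonacci(i), with -1 meaning "not computed yet".)

#include <string.h>

int FibonacciMemo(int n, int *memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n];  /* reuse the shared subproblem's answer */
    memo[n] = FibonacciMemo(n - 1, memo) + FibonacciMemo(n - 2, memo);
    return memo[n];
}

A caller would first mark every entry unknown:

int memo[5];
memset(memo, -1, sizeof memo);   /* every byte 0xFF, so each int reads as -1 */
int f4 = FibonacciMemo(4, memo); /* only 5 distinct computations now happen */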
To compute F(n) (the nth Fibonacci number), the above code makes a total of 2F(n+1) - 1 recursive calls. Since the Fibonacci numbers grow exponentially, the work required grows exponentially as well. However, it's possible to exploit the fact that many of the recursive calls are identical to speed this up dramatically. Rather than starting at Fibonacci(4) and working down, let's start at Fibonacci(0) and work up. Specifically, we'll build a table (let's call it FTable) of length 5 and fill it in, starting with the base cases:

FTable[0] = 0
FTable[1] = 1
Now, suppose we want to compute FTable[2]. This requires us to know FTable[0] and FTable[1], but we already know both because they're in the table. We can thus set

FTable[2] = FTable[1] + FTable[0] = 1 + 0 = 1
Using similar logic, we can compute FTable[3] from FTable[2] and FTable[1]:

FTable[3] = FTable[2] + FTable[1] = 1 + 1 = 2
And FTable[4] from FTable[3] and FTable[2]:

FTable[4] = FTable[3] + FTable[2] = 2 + 1 = 3
Notice how we avoided making lots of overlapping recursive calls simply by building the values up in the reverse order! This computes Fibonacci numbers in time O(n), which is exponentially faster than before. Using some more advanced math we can do even better than this, but it does illustrate why dynamic programming can take infeasible problems and make them suddenly feasible.
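Put together, the table-based approach might look like this in code (a minimal sketch of the idea above; the function name FibonacciTable is my own):

int FibonacciTable(int n) {
    if (n <= 1) return n;
    int FTable[n + 1];    /* FTable[i] will hold the ith Fibonacci number */
    FTable[0] = 0;        /* base cases */
    FTable[1] = 1;
    for (int i = 2; i <= n; i++) {
        /* Each value is read out of the table rather than recomputed,
           so every subproblem is solved exactly once. */
        FTable[i] = FTable[i - 1] + FTable[i - 2];
    }
    return FTable[n];
}

Each entry is written once and read at most twice, which is where the O(n) bound comes from. (In practice you only ever need the last two entries, so the table can be replaced by two variables, dropping the memory cost to O(1).)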
Hope this helps!