
What's the difference between recursion, memoization & dynamic programming? [duplicate]

Possible Duplicate:
Dynamic programming and memoization: top-down vs bottom-up approaches

I have gone through a lot of articles on this but can't seem to make sense of it. At times recursion and dynamic programming look the same, and at other times memoization and dynamic programming look alike. Can someone explain the difference to me?

P.S. It would also be helpful if you could point me to some code that uses the three approaches on the same problem (e.g. the Fibonacci series problem; I think every article I read used recursion but referred to it as dynamic programming).

asked Aug 26 '12 by Sakibul Alam

People also ask

How is recursion different from memoization?

A recursive function might end up performing the same calculation with the same input multiple times, which means it could take longer than the iterative alternative. A memoization function lets us store each input alongside the result of its calculation, so repeated calls are answered from the cache.
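As an illustration, here is a minimal sketch (not from the original answers; the slow_square stand-in and the std::map-based cache are assumptions made for the example) of memoizing a plain function:

#include <map>

// stand-in for some expensive calculation (illustrative only)
int slow_square(int x) { return x * x; }

int memo_square(int x) {
    static std::map<int, int> cache;   // maps each input to its previously computed result
    auto it = cache.find(x);
    if (it != cache.end())
        return it->second;             // result already stored for this input
    int result = slow_square(x);
    cache[x] = result;                 // store the input alongside its result
    return result;
}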

Is memoization only for recursion?

Memoization is just a caching technique that happens to be commonly used to optimize recursion; it cannot replace recursion.

Is recursion with memoization the same as dynamic programming?

Dynamic programming is essentially recursion with memoization, i.e. calculating and storing values that can later be accessed to solve subproblems that occur again, which makes your code faster and reduces the time complexity (fewer CPU cycles are spent recomputing).

What is the main difference between recursion and dynamic programming?

Recursion is when a function calls and executes itself, while dynamic programming is a technique for solving a complex problem by breaking it down into subproblems and reusing their solutions.


1 Answer

Consider calculating the Fibonacci sequence:

Pure recursion:

int fib(int x) {
    if (x < 2)
        return 1;

    return fib(x-1) + fib(x-2);
}

This results in an exponential number of calls.

Recursion with memoization/DP:

int fib(int x) {
    // N is assumed to be defined elsewhere as an upper bound on x
    static vector<int> cache(N, -1);   // -1 marks "not computed yet"

    int& result = cache[x];

    if (result == -1)
    {
        if (x < 2)
            result = 1;
        else
            result = fib(x-1) + fib(x-2);
    }

    return result;
}

Now we have a linear number of calls the first time, and a constant number thereafter.

The above method is called "lazy". We calculate the earlier terms the first time they are asked for.

The following would also be considered DP, but without recursion:

int fibresult[N];

void setup_fib() {
    fibresult[0] = 1;
    fibresult[1] = 1;
    for (int i = 2; i < N; i++)
        fibresult[i] = fibresult[i-1] + fibresult[i-2];
}

int fib(int x) { return fibresult[x]; }

This way may be described as "eager", "precaching" or "iterative". It's faster overall, but we have to manually figure out the order in which the subproblems need to be calculated. This is easy for Fibonacci, but for more complex DP problems it gets harder, so we fall back to the lazy recursive method if it is fast enough.

Also the following is neither recursion nor DP:

int fib(int x) {
    int a = 1;
    int b = 1;
    for (int i = 2; i <= x; i++)   // <= x so that e.g. fib(2) == 2, matching the versions above
    {
        a = a + b;
        swap(a, b);
    }
    return b;
}

It uses constant space and linear time.

For the sake of completeness, I will also mention that there is a closed form for Fibonacci that uses neither recursion nor DP, and allows us to calculate a Fibonacci term in constant time using a mathematical formula based on the golden ratio:

http://www.dreamincode.net/forums/topic/115550-fibonacci-closed-form/
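As a rough sketch of that closed form (Binet's formula) in the same style as the code above; the x+1 adjustment is an assumption to match this answer's indexing where fib(0) == fib(1) == 1, and double-precision rounding limits it to roughly the first 70 terms:

#include <cmath>

// Binet's formula: the nth Fibonacci number equals round(phi^n / sqrt(5)),
// where phi is the golden ratio. This answer's fib(0) == fib(1) == 1
// corresponds to F(x+1) in the usual numbering where F(0) = 0, hence x + 1.
int fib(int x) {
    const double sqrt5 = std::sqrt(5.0);
    const double phi = (1.0 + sqrt5) / 2.0;   // golden ratio
    return (int)std::llround(std::pow(phi, x + 1) / sqrt5);
}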

answered Oct 17 '22 by Andrew Tomazos