
Big O of Recursive Methods

I'm having difficulty determining the big O of simple recursive methods. I can't wrap my head around what happens when a method is called multiple times. I would be more specific about my areas of confusion, but at the moment I'm trying to answer some homework questions, and since I don't want to cheat, I ask that anyone responding to this post come up with a simple recursive method and provide a simple explanation of its big O. (Preferably in Java... a language I'm learning.)

Thank you.

asked Jul 31 '12 by user1333461


People also ask

What is the big O notation for a recursive function?

The big-O runtime of a recursive function is determined by the total number of recursive calls and the work done in each call. If each call does a constant amount of work, the runtime is proportional to the number of calls; for example, a recursive function on input N that is called N times, doing O(1) work per call, has a runtime of O(N).

What is the time complexity of a recursive method?

The time complexity of recursion depends on two factors: 1) the total number of recursive calls, and 2) the time complexity of the additional operations performed in each call. A recursion tree is a diagram that represents the cost of each recursive call in terms of the input size n, which makes it easier to sum the work across all calls.

Does recursion reduce time complexity?

Recursion by itself does not reduce time complexity, and it can increase it. A classic example is calculating Fibonacci numbers: computing the sequence up to a number n with naive recursion takes much longer than the iterative approach, because the recursive version recomputes the same subproblems over and over.
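For comparison, here is a sketch of the iterative approach (the class and method names are my own, for illustration): a single loop does constant work per step, so it runs in O(n).

```java
// Iterative Fibonacci: one loop iteration per step, so O(n) time overall.
public class FibIterative {
    static long fib(int n) {
        long prev = 0, curr = 1;           // fib(0) = 0, fib(1) = 1
        for (int i = 0; i < n; i++) {
            long next = prev + curr;       // advance the pair one step
            prev = curr;
            curr = next;
        }
        return prev;                       // after n steps, prev == fib(n)
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // 55
    }
}
```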


2 Answers

You can define the order recursively as well. For instance, let's say you have a function f, and calculating f(n) takes k steps. Now you want to calculate f(n+1). If f(n+1) calls f(n) once, then f(n+1) takes k plus some constant number of steps. Each invocation adds a constant number of extra steps, so this method is O(n).
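As a hypothetical illustration of that pattern (my own example, not from the answer): a recursive sum makes exactly one recursive call per level and does constant work in each call, so it is O(n).

```java
// Each call does O(1) work and makes one recursive call,
// so sumTo(n) makes n + 1 calls in total: O(n) time.
public class SumTo {
    static int sumTo(int n) {
        if (n <= 0) return 0;        // base case: constant work
        return n + sumTo(n - 1);     // one recursive call per level
    }

    public static void main(String[] args) {
        System.out.println(sumTo(5)); // 15
    }
}
```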

Now look at another example. Let's say you implement Fibonacci naively by adding the two previous results:

fib(n) = { if (n < 2) return n; return fib(n-1) + fib(n-2) }

Now let's say you can calculate fib(n-2) and fib(n-1) each in about k steps. To calculate fib(n) you then need k + k = 2*k steps. Likewise, calculating fib(n+1) takes roughly twice as many steps as calculating fib(n). In other words, the work doubles every time n grows by one, so this method seems to be O(2^n).
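Written out in Java (a sketch with the standard base cases), the naive version looks like this; each call spawns two more calls, which is where the exponential blow-up comes from:

```java
// Naive recursive Fibonacci: each call makes two recursive calls,
// so the call tree roughly doubles with n, giving O(2^n) time.
public class NaiveFib {
    static long fib(int n) {
        if (n < 2) return n;              // base cases: fib(0)=0, fib(1)=1
        return fib(n - 1) + fib(n - 2);   // two recursive calls per level
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // 55
    }
}
```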

Admittedly, this is not very formal, but hopefully this way you can get a bit of a feel.

answered Sep 30 '22 by markijbema


You might want to refer to the master theorem for finding the big O of recursive methods. Here is the wikipedia article: http://en.wikipedia.org/wiki/Master_theorem

You want to think of a recursive problem like a tree. Then, consider each level of the tree and the amount of work required. Problems generally fall into three categories: root heavy (the first iteration dominates the rest of the tree), balanced (each level does an equal amount of work), and leaf heavy (the last iteration dominates the rest of the tree).

Taking merge sort as an example:

define mergeSort(list toSort):
    if(length of toSort <= 1):
        return toSort
    list left = toSort from [0, length of toSort/2)
    list right = toSort from [length of toSort/2, length of toSort)
    return merge(mergeSort(left), mergeSort(right))

You can see that each call of mergeSort in turn calls 2 more mergeSorts of 1/2 the original length. We know that the merge procedure will take time proportional to the number of values being merged.

The recurrence relation is then T(n) = 2*T(n/2) + O(n). The 2 comes from the two recursive calls, and the n/2 comes from each call receiving only half the elements. However, at each level of the tree there are n elements in total that need to be merged, so the work per level is O(n).

We know the work is evenly distributed (O(n) each depth) and the tree is log_2(n) deep, so the big O of the recursive function is O(n*log(n)).
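The pseudocode above can be sketched in Java like this (array copying via `Arrays.copyOfRange` is my own choice for a self-contained example); the structure matches the recurrence T(n) = 2*T(n/2) + O(n), which resolves to O(n log n):

```java
import java.util.Arrays;

// Merge sort: two recursive calls on halves, plus an O(n) merge per level.
public class MergeSortDemo {
    static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;                        // base case
        int mid = a.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    // Merge two sorted arrays in time proportional to their total length.
    static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(mergeSort(new int[]{5, 2, 4, 1, 3})));
        // [1, 2, 3, 4, 5]
    }
}
```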

answered Sep 30 '22 by Calvin Jia