Order of growth of specific recursive function

What is the order of growth of the following function?

    static int counter = 0;

    static void Example(int n)
    {
        if (n == 1) return;

        for (int i = 0; i < n; i++)
        {
            counter++;
        }

        Example(n / 2);
    }

To solve it, I did the following:

I first got rid of the inner for loop, ending up with:

    static void Example(int n)
    {
        if (n == 1) return;            

        counter++;

        Example(n / 2);
    }

By analyzing that, I was able to tell that the order of growth of that method is log base 2 of n.

So I am thinking that the final answer will be log base 2 of n times something else. How can I figure it out?

Asked Mar 01 '12 by Tono Nam

1 Answer

Let's look at both the original and the modified function. In the original function, you have the following amount of work done:

  • A constant amount of work for the base case check.
  • O(n) work to count up the number.
  • The work required for a recursive call to something half the size.

We can express this as a recurrence relation:

  • T(1) = 1
  • T(n) = n + T(n / 2)
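
Before expanding this by hand, it can help to sanity-check the recurrence numerically. Here's a minimal sketch (the helper name T is purely illustrative, not part of the question's code) that evaluates it directly:

    // Direct translation of the recurrence: T(1) = 1, T(n) = n + T(n / 2).
    static long T(long n)
    {
        if (n == 1) return 1;
        return n + T(n / 2);
    }

    // For example, T(1024) evaluates to 2047 -- already suggesting linear growth.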

Let's see what this looks like. We can start expanding this out by noting that

T(n) = n + T(n / 2)

= n + (n / 2 + T(n / 4))

= n + n/2 + T(n / 4)

= n + n/2 + (n / 4 + T(n / 8))

= n + n/2 + n/4 + T(n / 8)

We can start to see a pattern here. If we expand out the T(n / 2) bit k times, we get

T(n) = n + n/2 + n/4 + ... + n/2^k + T(n/2^k)

Eventually, this stops when n / 2^k = 1. When this happens, we have

T(n) = n + n/2 + n/4 + n/8 + ... + 1

What does this evaluate to? Interestingly, this sum comes out to 2n - 1 (when n is a power of two), because the infinite series n + n/2 + n/4 + n/8 + ... converges to 2n. Consequently, this first function is O(n). We could have also arrived at this conclusion by using the Master Theorem, but it's interesting to see this approach as well.
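
If you want to verify this empirically, a quick sketch along these lines (assuming the original Example and counter from the question are in scope) shows counter growing linearly; it lands at 2n - 2 for powers of two, since the n == 1 call does no counting:

    static void Main()
    {
        // Compare the number of increments against 2n for a few powers of two.
        foreach (int n in new[] { 16, 256, 4096, 65536 })
        {
            counter = 0;
            Example(n);
            Console.WriteLine($"n = {n}, counter = {counter}, 2n = {2 * n}");
        }
    }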


So now let's look at the new function. This one does

  • A constant amount of work for the base case check.
  • The work required for a recursive call to something half the size.

We can write this recurrence as

  • T(1) = 1
  • T(n) = 1 + T(n / 2)
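
As before, this recurrence can be checked numerically with a direct translation (again, the helper name is purely illustrative):

    // Direct translation of the recurrence: T(1) = 1, T(n) = 1 + T(n / 2).
    static long T2(long n)
    {
        if (n == 1) return 1;
        return 1 + T2(n / 2);
    }

    // For example, T2(1024) evaluates to 11, i.e. log2(1024) + 1.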

Using the same trick as before, let's expand out T(n):

T(n) = 1 + T(n / 2)

= 1 + 1 + T(n / 4)

= 1 + 1 + 1 + T(n / 8)

More generally, after expanding out k times, we get

T(n) = k + T(n / 2^k)

This stops when n / 2^k = 1, which happens when k = log_2 n (that is, lg n). In this case, we get that

T(n) = lg n + T(1) = lg n + 1

So T(n) = O(lg n) in this case.
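
And to see it on the code itself: a sketch like this (assuming the stripped-down Example and counter from the question are in scope) shows counter tracking log2 n, since only the non-base-case calls increment it:

    static void Main()
    {
        // counter ends up at log2(n) for powers of two: one increment per halving.
        foreach (int n in new[] { 16, 1024, 1 << 20 })
        {
            counter = 0;
            Example(n);
            Console.WriteLine($"n = {n}, counter = {counter}, log2(n) = {Math.Log(n, 2)}");
        }
    }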

Hope this helps!

Answered Oct 23 '22 by templatetypedef