Big O for recursion function

I know that when you split the size of the problem set into a specified fraction you're dealing with O(log(n)). However, I am confused when there is more than one recursive call that does this. For example, the following function calculates the value of an exponent.

    public static long pow(long x, int n)
    {
        if (n == 1)
            return x;
        if (isEven(n))
            return pow(x, n / 2) * pow(x, n / 2);
        else
            return pow(x * x, n / 2) * x;
    }

After doing the analysis, I concluded that the running time is O(n). Am I correct? Thanks for your time.

asked Sep 19 '15 by Varun Gorantla

3 Answers

Yes, you are correct, at least under worst case analysis.

Note that for n = 2^k, for some natural k, the condition isEven(n) is always true until the stop clause is reached, so the function recurses twice at every level.

When that is established, it is enough to analyze:

T(n) = T(n/2) + T(n/2) + X

where X is some constant (each invocation does a constant amount of work if we ignore the recursive calls themselves).

From master theorem case 1, with:

f(n) = X
a = 2, b = 2, c = 0 (since X is in O(n^0)=O(1))

And since c=0 < 1 = log_2(2), conditions for case 1 apply, and we can conclude the function T(n) is in Theta(n^log_2(2)) = Theta(n)
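As a quick sanity check of this Theta(n) bound, here is a small sketch (the class and helper names are mine, not from the question) that instruments the original code and counts how many times pow() is invoked when n is a power of two:

```java
// Sketch: count pow() invocations for n = 2^k. The recurrence
// C(n) = 2C(n/2) + 1, C(1) = 1 solves to C(n) = 2n - 1, i.e. Theta(n).
public class PowCallCount {
    static long calls;

    static long pow(long x, int n) {
        calls++;
        if (n == 1) return x;
        if (n % 2 == 0)
            return pow(x, n / 2) * pow(x, n / 2); // two recursive calls
        else
            return pow(x * x, n / 2) * x;         // one recursive call
    }

    static long countCalls(int n) {
        calls = 0;
        pow(2, n); // the numeric result may overflow long; only calls matter here
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(countCalls(8));    // 15 = 2*8 - 1
        System.out.println(countCalls(1024)); // 2047 = 2*1024 - 1
    }
}
```

The call count grows linearly in n, matching the Theta(n) result from the master theorem.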

Average case analysis:

For the average case, with a uniformly distributed number n, half of the bits in the binary representation of this number will be set (1), and half will be clear (0).

Since dividing by 2 is basically arithmetic shift right, and the condition isEven(n) is true if and only if the least significant bit is 0, this means the mean complexity function is:

T(n) = 0.5 T(n/2) + 0.5 * 2 T(n/2) + X = 0.5 * 3 * T(n/2)  + X 
     = 1.5 * T(n/2) + X

So

T(n) = 3/2 T(n/2) + X

Case 1 still applies (assuming constant X):

a = 3/2, b=2, c = 0

and you get an average case complexity of Theta(n^(log_2(1.5))) ≈ Theta(n^0.585)

Quick Note:

This indeed assumes all arithmetic operations are O(1). If this is not the case (very large numbers), you should put their complexity in place of the constant in the definition of T(n) and reanalyze. Assuming each such operation is sub-linear (in the number, not in the bits representing it), the result remains Theta(n), since case 1 of the master theorem still applies. (For the average case, any function "better" than ~O(n^0.585) won't change the result shown.)

answered Oct 09 '22 by amit

Varun, you are partly correct. Let's see the two cases:

  1. If n is even, then you are just dividing the task into two halves without optimizing significantly, since pow(x, n/2) is calculated twice.

  2. If n is odd, then you have a special case of decomposition. x will be replaced by x*x, which makes x*x the new base, saving you from recalculating it at deeper levels.

In the first case we have a slight improvement, since it is easier to repeat smaller products than to do the whole thing at once:

(3 * 3 * 3 * 3) * (3 * 3 * 3 * 3) is easier to calculate than (3 * 3 * 3 * 3 * 3 * 3 * 3 * 3), so the first case slightly improves the calculation by using the fact that multiplication is an associative operation. The number of multiplications executed in the first case is not reduced, but the way the computation is structured is better.

In the second case, however, you have significant optimizations. Let's suppose that n = (2^p) - 1. In that case, we reduce the problem to one where n = ((2^p - 1) - 1) / 2 = ((2^p) - 2) / 2 = (2^(p-1)) - 1. So, if p is a natural number and n = (2^p) - 1, then you are reducing the problem to half its size at each step. So, the algorithm is logarithmic in the best case scenario, n = (2^p) - 1, and linear in the worst case scenario, n = 2^p.
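Since the even branch computes pow(x, n/2) twice, a natural follow-up is to compute the half once and square it, which makes the even case as cheap as the odd one. This is a hedged sketch (my own variant, not the code from the question) of that change:

```java
// Sketch (not the original code): reuse the half-result so the even
// branch makes only one recursive call, giving Theta(log n) overall.
public class FastPow {
    static long pow(long x, int n) {
        if (n == 1) return x;
        if (n % 2 == 0) {
            long half = pow(x, n / 2);
            return half * half;           // one recursive call instead of two
        } else {
            return pow(x * x, n / 2) * x; // unchanged odd branch
        }
    }

    public static void main(String[] args) {
        System.out.println(pow(3, 8));  // 6561
        System.out.println(pow(2, 10)); // 1024
        System.out.println(pow(5, 3));  // 125
    }
}
```

With this change the recurrence becomes T(n) = T(n/2) + O(1) in both branches, which is Theta(log n), the usual exponentiation-by-squaring bound.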

answered Oct 09 '22 by Lajos Arpad


We usually analyze worst case time complexity, which happens when isEven(n) is true. In that case, we have

T(n) = 2T(n/2) + O(1)

where T(n) means the time complexity of pow(x, n).

Applying the Master theorem, Case 1, gives the Big-O form of T(n). That is:

T(n) = O(n)

answered Oct 09 '22 by Meng Wang