What's the big-O complexity of this recursive algorithm?

I am following a course of Algorithms and Data Structures.

Today, my professor said the complexity of the following algorithm is O(2^n).

I waited until the lesson was over, approached him, and told him I actually believed it was an O(n) algorithm. I had done the computation to prove it and wanted to show it to him, but he kept insisting it was not, without giving me any convincing explanation.

The algorithm is recursive, and its running time satisfies this recurrence:

       { 1         if n=1
T(n) = {
       { 2T(n/2)   otherwise

I computed it down to be a O(n), this way:

Let's expand T(n)

T(n) = 2 * T(n/2)
     = 2 * [2 * T(n/(2^2))]
     = 2^2 * T(n/(2^2))
     = 2^2 * [2 * T(n/(2^3))]
     = 2^3 * T(n/(2^3))
     = ...
     = 2^i * T(n/(2^i)).

We stop when the term inside the T is 1, that is:

n/(2^i) = 1 ==> n = 2^i ==> i = log2 n

After the substitution, we obtain

T(n) = 2^(log2 n) * T(1)
     = n * 1
     = O(n).
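As a sanity check on the derivation above, here is a minimal Python sketch (my own, not from the course) that evaluates the recurrence literally for powers of two and compares it against the closed form T(n) = n:

```python
# Evaluate T(1) = 1, T(n) = 2*T(n/2) directly for powers of two
# and check it against the closed form T(n) = n derived above.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2)

for k in range(12):
    n = 2 ** k
    assert T(n) == n  # matches 2^(log2 n) * T(1) = n
```

For every power of two tested, the recurrence evaluates to exactly n, in line with the O(n) claim.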

Since this algorithm came up during a lesson on Merge Sort, I noted that Merge Sort, which famously runs in O(n log n), satisfies the recurrence 2T(n/2) + Θ(n) (obviously larger, term by term, than 2T(n/2)). I asked him how an algorithm with a smaller recurrence could end up with a higher big-O, because at this point it was counter-intuitive to me. He replied, word for word, "If you think that is counter-intuitive, you have serious problems in your math."
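To make the comparison concrete, here is a small sketch (my own names) that evaluates both recurrences side by side for powers of two, assuming T(1) = M(1) = 1:

```python
import math

# T(n) = 2T(n/2)        -- the algorithm in question
# M(n) = 2M(n/2) + n    -- the merge-sort-shaped recurrence
def T(n):
    return 1 if n == 1 else 2 * T(n // 2)

def M(n):
    return 1 if n == 1 else 2 * M(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n                          # exactly linear
    assert M(n) == n * int(math.log2(n)) + n  # n*log2(n) + n
```

The extra Θ(n) term at every level is precisely what lifts the solution from n to n log n + n.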

My questions are:

  1. Is there any fallacy in my demonstration?
  2. Wouldn't the last situation be counter-intuitive?

Yes, this is also a vent.

asked Mar 15 '23 by doplumi

1 Answer

Proof - 1

This recurrence falls under the leaf-dominated case of the Master Theorem (case 1 in the CLRS numbering), with

  • a = 2;
  • b = 2; and,
  • c = -∞ (there is no f(n) term, so f(n) = 0 = O(n^c) for every c)

and thus log_b a = log_2 2 = 1, which is bigger than -∞. Therefore the running time is Θ(n^(log_b a)) = Θ(n^1) = Θ(n).


Proof - 2

Intuitively, you are breaking the problem of size n into 2 problems of size n/2, and the cost of combining the results of the two sub-problems is 0 (i.e. there is no additive term in the recurrence).

Hence at the bottom-most level you have n problems of cost 1 each, resulting in a running time of n * O(1), which is O(n).
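This counting argument can be sketched directly (a minimal example of mine, assuming n is a power of two): walk the recursion tree and count the size-1 subproblems at the bottom, plus the total number of calls.

```python
# Count (leaves, total calls) of the recursion tree for
# T(n) = 2T(n/2), T(1) = 1, with n a power of two.
def stats(n):
    if n == 1:
        return (1, 1)  # one unit-cost problem at the bottom level
    leaves, calls = stats(n // 2)
    # two identical subtrees, plus this call itself
    return (2 * leaves, 2 * calls + 1)

leaves, calls = stats(64)
assert leaves == 64        # n problems of cost 1 at the bottom
assert calls == 2 * 64 - 1 # 2n - 1 calls in total, still O(n)
```

There are exactly n leaves and 2n - 1 calls overall, so even counting every internal call the work remains linear.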


Edit: Just to complete this answer I will also add the answers to specific questions asked by you.

Is there any fallacy in my demonstration?

No. It is correct.

Wouldn't the last situation be counter-intuitive?

It definitely is counter-intuitive at first glance. See Proof - 2 above.

answered Mar 19 '23 by displayName