
Space Complexity of Quick Sort

I have learnt that the space complexity of quicksort without Sedgewick's trick of eliminating tail recursion is O(n). But if we trace the calls stored on the stack, there are only O(log n) of them at any point, as shown in the figure. [Figure: stack calls while calculating the values at the leaves]

In the figure,

while calculating the value of (1,1), we store the calls [(1,8), (1,4), (1,2)],

while calculating the value of (3,3), we store the calls [(1,8), (1,4), (3,4)], and so on,

which occupy only O(log n) space at any point in time. Then how does the space complexity become O(n)?
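The stacks described above can be reproduced with a small sketch (in Python; `leaf_stacks` is a hypothetical helper, not from the post) that walks the ranges of a quicksort whose split is perfectly balanced at every call, and records which ancestor calls are still on the stack when each single-element range (leaf) is reached:

```python
def leaf_stacks(lo, hi, stack=None, out=None):
    """For a perfectly balanced quicksort recursing on the range
    (lo, hi), return for each leaf the list of ancestor calls that
    are still held on the call stack when that leaf is reached."""
    if stack is None:
        stack, out = [], []
    if lo >= hi:
        out.append(((lo, hi), list(stack)))  # leaf: snapshot the stack
        return out
    mid = (lo + hi) // 2                     # exact-median split, as in the figure
    stack.append((lo, hi))                   # this call stays on the stack
    leaf_stacks(lo, mid, stack, out)
    leaf_stacks(mid + 1, hi, stack, out)
    stack.pop()                              # this call returns
    return out

for leaf, stack in leaf_stacks(1, 8):
    print(leaf, stack)
```

The first leaf printed is (1,1) with stack [(1,8), (1,4), (1,2)], and (3,3) comes with [(1,8), (1,4), (3,4)], matching the figure; no stack ever holds more than log2(8) = 3 calls.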

asked Jul 20 '16 by kaushalpranav


1 Answer

In the tree example you gave above, you showed a run of quicksort that always happens to pick the exact median element as the splitting point at each step. That makes the recursion depth O(log n) and so, as you noted, the space usage would be O(log n) even without the optimization.
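This can be made concrete with a short sketch (in Python; `depth_median_pivot` is an illustrative helper, not from the answer): when the pivot is always the exact median, each recursive call halves the subarray, so the recursion depth grows like log2(n).

```python
import math

def depth_median_pivot(n):
    """Recursion depth of quicksort when the pivot is always the exact
    median: a subarray of size n splits into halves of size ~n/2, so
    only one extra stack frame is added each time the size halves."""
    if n <= 1:
        return 1
    return 1 + depth_median_pivot(n // 2)

# Doubling n adds only one level: the depth tracks ceil(log2 n) + 1.
for n in (8, 1024, 1 << 20):
    print(n, depth_median_pivot(n), math.ceil(math.log2(n)) + 1)
```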

But what happens if you get a bad run of quicksort? That is, what happens if you always pick the absolute biggest or absolute smallest element of the array as the pivot at each point? Then your recursion tree will look something like this:

    size n
       \
       size n-1
         \
         size n-2
           \
            ...
             \
              1

Now your recursion tree has height Θ(n), so if implemented without any tail call elimination, quicksort will use Θ(n) space: one stack frame per active recursive call.
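The lopsided case above can be demonstrated directly. The sketch below (Python, not from the answer) is a naive quicksort with a Lomuto-style partition that always picks the first element as the pivot and records the maximum recursion depth reached; on already-sorted input every partition peels off a single element, so the depth is linear in n:

```python
def quicksort_depth(a, lo=0, hi=None, depth=1, stats=None):
    """Naive quicksort (first element as pivot, no tail-call
    elimination) that sorts `a` in place and returns the maximum
    recursion depth reached."""
    if hi is None:
        hi = len(a) - 1
        stats = {"max_depth": 0}
    stats["max_depth"] = max(stats["max_depth"], depth)
    if lo < hi:
        pivot = a[lo]
        i = lo
        for j in range(lo + 1, hi + 1):   # Lomuto-style partition
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[lo], a[i] = a[i], a[lo]         # pivot into its final slot
        quicksort_depth(a, lo, i - 1, depth + 1, stats)
        quicksort_depth(a, i + 1, hi, depth + 1, stats)
    return stats["max_depth"]

# Sorted input with a first-element pivot is the worst case: each call
# recurses on a subarray only one element smaller, giving Θ(n) depth.
print(quicksort_depth(list(range(50))))    # depth 50 for n = 50
print(quicksort_depth(list(range(100))))   # depth 100 for n = 100
```

A randomized pivot (or median-of-three) makes such runs unlikely, but this is exactly why the unoptimized worst-case space bound is Θ(n).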

answered Oct 13 '22 by templatetypedef