 

Recursion in sorting algorithms - always bad? [closed]

Mergesort and quicksort are probably the best-known O(n log n) sorting algorithms. Their explanations and C++ code examples in most cases use recursion. But as far as I understand, with a large amount of data we run a big risk of stack overflow when using recursion. So is it reasonable to dismiss recursive explanations of sorting algorithms as something that can't be used in real life?

asked Jan 30 '13 by ManInTheHood

People also ask

Is recursion good for sorting?

Recursive techniques can be used in sorting algorithms, allowing n elements to be sorted in O(n log n) time (compared with the O(n²) efficiency of bubble sort). Two such algorithms which will be examined here are Mergesort and Quicksort.
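As a sketch of the shape these explanations rely on (not a production implementation — it allocates a temporary buffer on every merge), a minimal recursive mergesort over `int` data might look like this:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Recursive mergesort over the half-open range [lo, hi):
// split in half, sort each half recursively, then merge.
void merge_sort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    if (hi - lo < 2) return;               // 0 or 1 elements: already sorted
    std::size_t mid = lo + (hi - lo) / 2;
    merge_sort(a, lo, mid);                // sort left half
    merge_sort(a, mid, hi);                // sort right half

    std::vector<int> merged;               // merge the two sorted halves
    merged.reserve(hi - lo);
    std::size_t i = lo, j = mid;
    while (i < mid && j < hi)
        merged.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i < mid) merged.push_back(a[i++]);
    while (j < hi)  merged.push_back(a[j++]);
    std::copy(merged.begin(), merged.end(), a.begin() + lo);
}
```

Each call splits the range in half, so the recursion depth is about log₂(n), not n.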

Why recursion is bad performance?

The simplicity of recursion comes at the cost of time and space efficiency. It is much slower than iteration due to the overhead of function calls and control shift from one function to another. It requires extra memory on the stack for each recursive call. This memory gets deallocated when function execution is over.

Which sorting technique is worst?

Definition of Bogosort. The universally-acclaimed worst sorting algorithm is Bogosort, sometimes called Monkey Sort or Random Sort, for reasons we'll see shortly. Bogosort develops from the idea that, in probability theory, if a certain phenomenon is possible, then it will eventually happen.
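For illustration only — the idea fits in a few lines of C++, but its expected running time is O(n · n!), so it should never run on more than a handful of elements:

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Bogosort: shuffle the data repeatedly until it happens to be sorted.
// Purely a teaching curiosity — expected O(n * n!) runtime.
void bogosort(std::vector<int>& a) {
    std::mt19937 rng(std::random_device{}());
    while (!std::is_sorted(a.begin(), a.end()))
        std::shuffle(a.begin(), a.end(), rng);
}
```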

Why recursive algorithms are inefficient?

Recursive algorithms are often inefficient for small data, due to the overhead of repeated function calls and returns. For this reason efficient implementations of recursive algorithms often start with the recursive algorithm, but then switch to a different algorithm when the input becomes small.


1 Answer

But as far as I understand, with a large amount of data we run a big risk of stack overflow when using recursion.

It depends on several things:

  • Tail calls are nearly always optimized these days, so you would never hit stack overflow with tail recursion, even if you recurse O(2^N) times (the algorithm would still be slow, but it wouldn't overflow the stack).
  • Most sorting algorithms recurse to a depth of about log₂(N). That comes to 40 levels per terabyte of data being sorted - not enough to overflow the stack of anything capable of holding a terabyte of data in its memory.
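That log₂(N) depth can be checked directly. The helper below (a hypothetical illustration, not part of any sorting library) mimics a divide-in-half sort and reports the deepest level it reaches — ⌈log₂(n)⌉, independent of the data:

```cpp
#include <algorithm>
#include <cstddef>

// Simulate the call tree of a sort that splits its range in half,
// returning the maximum recursion depth reached for n elements.
std::size_t split_depth(std::size_t n, std::size_t depth = 0) {
    if (n < 2) return depth;              // a leaf: nothing left to split
    std::size_t half = n / 2;
    return std::max(split_depth(half, depth + 1),
                    split_depth(n - half, depth + 1));
}
```

For a million elements the answer is 20; for 2⁴⁰ (a terabyte of single-byte items) it would be 40 — the figure quoted above.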

is it reasonable to dismiss recursive explanations of sorting algorithms as something that can't be used in real life?

No, it is not reasonable to ignore these explanations: if an algorithm is logically recursive, the best explanation will also be recursive. Even if you implement the algorithm with a loop that uses a dynamically allocated stack to avoid stack overflows, the nature of the algorithm would remain recursive, so the best way to understand what's going on is to pretend that a recursive call is made.

answered Sep 20 '22 by Sergey Kalinichenko