
Why do we need so many sorting techniques?

There is a plethora of sorting techniques in data structures, for example:

  • Selection Sort
  • Bubble Sort
  • Recursive Bubble Sort
  • Insertion Sort
  • Recursive Insertion Sort
  • Merge Sort
  • Iterative Merge Sort
  • Quick Sort
  • Iterative Quick Sort
  • Heap Sort
  • Counting Sort
  • Radix Sort
  • Bucket Sort
  • Shell Sort
  • Tim Sort
  • Comb Sort
  • Pigeonhole Sort
  • Cycle Sort
  • Cocktail Sort
  • Strand Sort

and many more.
Do we need all of them?

HariomJoshi asked Sep 11 '25 14:09

2 Answers

There’s no single reason why so many different sorting algorithms exist. Here’s a sampler of sorting algorithms and where they came from, to give a better sense of their origins:

  • Radix sort was invented in the late 1800s for physically sorting punched cards for the US census. It’s still used today in software because it’s very fast on numeric and string data.
  • Merge sort appears to have been invented by John von Neumann to validate his stored-program computer model (the von Neumann architecture). It works well as a sorting algorithm for low-memory computers processing data that’s streamed through the machine, hence its popularity in the 1960s and 1970s. And it’s a great testbed for divide-and-conquer techniques, making it popular in algorithms classes.
  • Insertion sort seems to have been around forever. Even though it’s slow in the worst case, it’s fantastic on small inputs and mostly-sorted data and is used as a building block in other fast sorting algorithms.
  • Quicksort was invented in 1961. It plays excellently with processor caches, hence its continued popularity.
  • Sorting networks were studied extensively many years back. They’re still useful as building blocks in theoretical proof-of-concept algorithms like signature sort.
  • Timsort was invented for Python and was designed to sort practical, real-world sequences faster than other sorts by taking advantage of common distributions and patterns.
  • Introsort was invented as a practical way to harness the speed of quicksort without its worst-case behavior.
  • Shellsort was invented over fifty years ago and was practical on the computers of its age. Probing its theoretical limits was a difficult mathematical problem for folks who studied it back then.
  • Thorup and Yao’s O(n sqrt(log log n))-time integer sorting algorithm was designed to probe the theoretical limits of efficient algorithms using word-level parallelism.
  • Cycle sort derives from the study of permutations in group theory and is designed to minimize the number of memory writes made when sorting the list.
  • Heapsort is noteworthy for being in-place and yet fast in practice. It’s based on the idea of implicitly representing a nontrivial data structure.
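To make the insertion sort point above concrete, here is a minimal sketch (function and variable names are my own) showing why it does so little work on small or mostly-sorted inputs:

```python
def insertion_sort(a):
    """Sort the list a in place and return it."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Each element only shifts left past elements larger than it.
        # On nearly-sorted input this inner loop exits almost immediately,
        # which is why insertion sort is used for small runs inside
        # hybrid sorts such as Timsort and introsort.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

insertion_sort([3, 1, 4, 1, 5])  # returns [1, 1, 3, 4, 5]
```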

This isn’t even close to an exhaustive list of sorting algorithms, but hopefully gives you a sense of what’s out there and why. :-)

templatetypedef answered Sep 13 '25 03:09


The main reason sorting algorithms are discussed and studied in early computer science classes is that they provide very good study material. The problem of sorting is simple, and it is a good excuse to present several algorithmic strategies and data structures, show how to implement them, discuss time and space complexity, and compare the different properties algorithms can have even when they apparently solve the same problem.

In practice, standard libraries for programming languages usually include a default sort function, such as std::sort in C++ or list.sort in Python; in almost every situation, you should trust that function and the algorithm it uses.
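For instance, a small sketch of leaning on the built-in sort in Python (the example data is my own); you supply a key function and let the library handle the rest:

```python
# Python's built-in sort (Timsort) handles the algorithmic details;
# you only describe what to sort by.
records = [("bob", 32), ("alice", 25), ("carol", 25)]

by_age = sorted(records, key=lambda r: r[1])  # new list, sorted by age
records.sort(key=lambda r: r[0])              # in-place, sorted by name
```

Because the built-in sort is stable, the two records with age 25 keep their original relative order in `by_age`.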

But everything you've learned about sorting algorithms is valuable and can be applied to other problems. Here is a non-exhaustive list of things that can be learned by studying sorting algorithms:

  • divide and conquer;
  • heaps;
  • binary search trees, including different types of self-balancing binary search trees;
  • the importance of choosing an appropriate data-structure;
  • the difference between in-place and out-of-place algorithms;
  • the difference between stable and unstable sorts;
  • recursive versus iterative approaches;
  • how to calculate time complexity, and how to compare the efficiency of two algorithms.
Stef answered Sep 13 '25 05:09