I was wondering why the C++ standard requires that std::sort only take random-access iterators. I don't see the advantage, since both std::sort and std::list::sort have a complexity of N*log(N). Restricting std::sort to random-access iterators (RAI) seems to have made it necessary to write a separate member function for lists with the same complexity. The same applies to partial_sort, whose non-RAI counterpart for list is simply missing to this day.

Is this design because people historically used variants of quicksort to implement std::sort?

If there is an advantage to writing sort algorithms on RAI containers, would it be better to make std::sort more general, and let RAI containers like std::vector provide a specialized v.sort?
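To illustrate the situation the question describes (a minimal sketch, the container contents are just illustrative): std::sort works directly on a std::vector, but a std::list has to fall back on its own member function.

    #include <algorithm>
    #include <list>
    #include <vector>

    int main() {
        std::vector<int> v{3, 1, 2};
        std::sort(v.begin(), v.end());    // OK: vector iterators are random-access

        std::list<int> l{3, 1, 2};
        // std::sort(l.begin(), l.end()); // does not compile: list iterators are bidirectional
        l.sort();                         // the separate member function with the same N*log(N) bound
    }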
Random-access iterators are iterators that can be used to access elements at an arbitrary offset position relative to the element they point to, offering the same functionality as pointers. Random-access iterators are the most complete iterators in terms of functionality.
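For instance (a small illustrative snippet, not part of the original answer), a random-access iterator supports constant-time jumps, subscripting, and iterator differences, just like a pointer:

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v{10, 20, 30, 40, 50};
        auto it = v.begin();
        std::cout << *(it + 3) << '\n';       // jump by an arbitrary offset: prints 40
        std::cout << it[4] << '\n';           // subscript notation: prints 50
        std::cout << (v.end() - it) << '\n';  // iterator difference in O(1): prints 5
    }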
std::sort is typically implemented with the Introsort algorithm. Since C++11 it is required to run in O(N log(N)) comparisons, where N = std::distance(first, last), and the order of equal elements is not guaranteed to be preserved. GCC's libstdc++ also uses Introsort.
std::stable_sort also sorts the elements in the range [first, last) in ascending order; it is like std::sort, but it keeps the relative order of elements with equivalent values.
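A short sketch of the difference (the struct and values are just illustrative): when sorting records by one key, std::stable_sort keeps the original order of records that compare equal, while std::sort is allowed to reorder them.

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Entry { int key; std::string name; };

    int main() {
        std::vector<Entry> v{{2, "a"}, {1, "b"}, {2, "c"}, {1, "d"}};

        // Equal keys keep their original relative order: (1,b) (1,d) (2,a) (2,c)
        std::stable_sort(v.begin(), v.end(),
                         [](const Entry& x, const Entry& y) { return x.key < y.key; });

        for (const auto& e : v)
            std::cout << e.key << ' ' << e.name << '\n';
    }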
A vector can also be accessed with array notation. If you don't want the default std::sort, you can pass std::sort a custom comparator, fall back to qsort, or write your own sorting routine.
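For example (illustrative only), the same vector sorted with a custom comparator and then with the C qsort function on the vector's contiguous storage:

    #include <algorithm>
    #include <cstdlib>
    #include <vector>

    // Comparator for qsort: returns negative, zero, or positive
    int compare_ints(const void* a, const void* b) {
        int x = *static_cast<const int*>(a);
        int y = *static_cast<const int*>(b);
        return (x > y) - (x < y);
    }

    int main() {
        std::vector<int> v{4, 1, 3, 2};

        // std::sort with a custom comparator (descending order)
        std::sort(v.begin(), v.end(), [](int a, int b) { return a > b; });

        // C-style qsort on the underlying array (ascending order)
        std::qsort(v.data(), v.size(), sizeof(int), compare_ints);
    }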
O(N*log(N)) complexity doesn't imply that the container is iterated in order, nor that changes to it are made only in scan order. Using sequential iterators instead would come at an O(N) memory cost to store all of those iterators.
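To make that concrete (a hedged sketch, not the standard library's implementation; middle_pivot is a hypothetical helper): a typical quicksort-style step jumps to the middle of the range to pick a pivot, which is O(1) with random-access iterators but costs a linear walk, or pre-storing an iterator to every element, with bidirectional ones.

    #include <iterator>
    #include <list>
    #include <vector>

    // Hypothetical helper: pick the element in the middle of [first, last) as a pivot.
    template <typename RandomIt>
    typename std::iterator_traits<RandomIt>::value_type
    middle_pivot(RandomIt first, RandomIt last) {
        return *(first + (last - first) / 2);  // O(1) only for random-access iterators
    }

    int main() {
        std::vector<int> v{7, 3, 9, 1, 5};
        int p1 = middle_pivot(v.begin(), v.end());  // constant-time jump

        std::list<int> l{7, 3, 9, 1, 5};
        // With bidirectional iterators the same jump needs a linear traversal:
        auto mid = l.begin();
        std::advance(mid, l.size() / 2);            // O(N) walk instead of O(1)
        int p2 = *mid;
        (void)p1; (void)p2;
    }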