While going through Wikipedia's list of sorting algorithms, I noticed that there is no stable comparison sort with O(n log n) worst-case time complexity and O(1) worst-case space complexity. This looks like a theoretical boundary, but I couldn't find any more information about it.
How would one prove this?
Note: I am aware of the Ω(n log n) lower bound on the worst-case time complexity of comparison sorts.
Several common sorting algorithms are stable by nature, such as Merge Sort, Timsort, Counting Sort, Insertion Sort, and Bubble Sort; others, such as Quicksort, Heapsort, and Selection Sort, are unstable.
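To make the stability property concrete, here is a minimal sketch using Python's built-in `sorted` (which is Timsort, a stable sort); the record data is made up for illustration:

```python
# Stability: elements that compare equal keep their original relative
# order. Python's sorted() is Timsort, which is stable.
records = [("apple", 2), ("banana", 1), ("cherry", 2), ("date", 1)]

# Sort by the numeric key only; ties keep their original order,
# so "banana" stays before "date" and "apple" before "cherry".
by_key = sorted(records, key=lambda r: r[1])
print(by_key)
# [('banana', 1), ('date', 1), ('apple', 2), ('cherry', 2)]
```

An unstable sort would be free to emit `('cherry', 2)` before `('apple', 2)`, since the two records compare equal under the key.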
Despite what that article says, in-place stable Merge Sort can be made O(n log n). Here is a paper that explains two ways to implement it.