I am trying to understand Space Complexities of different sorting algorithms.
http://bigocheatsheet.com/?goback=.gde_98713_member_241501229
From the above link I found that the space complexity of bubble sort, insertion sort and selection sort is O(1), whereas quick sort is O(log(n)) and merge sort is O(n).
We are not explicitly allocating extra memory in any of these algorithms, so why are the space complexities different when we are sorting the same array in each of them?
When you run code, memory is assigned in two ways:
Implicitly, as you set up function calls.
Explicitly, as you create chunks of memory.
Quicksort is a good example of implicit use of memory. While I'm doing a quicksort, I'm recursively calling myself O(n) times in the worst case and O(log(n)) times in the average case. Those recursive calls each take O(1) space to keep track of, leading to O(n) space in the worst case and O(log(n)) in the average case.
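To make the implicit cost concrete, here is a minimal in-place quicksort sketch (the function names and pivot choice are my own, not from the linked cheat sheet). No array is ever copied; the only extra space is one O(1) stack frame per recursive call, and the recursion depth is O(log(n)) on average but O(n) in the worst case (e.g. already-sorted input with this last-element pivot):

```python
def quicksort(a, lo=0, hi=None):
    """Sort list a in place; extra space is only the call stack."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[hi]                # last element as pivot (a simple choice)
    i = lo
    for j in range(lo, hi):      # partition in place: no new array created
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    quicksort(a, lo, i - 1)      # each recursive call adds one O(1) stack frame
    quicksort(a, i + 1, hi)

data = [5, 2, 9, 1, 5, 6]
quicksort(data)                  # data is now sorted; memory use was implicit
```

Note that a real implementation would recurse on the smaller half first (and loop on the larger) to guarantee O(log(n)) stack depth even in the worst case.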
Mergesort is a good example of explicit use of memory. I take two blocks of sorted data, create a place to put the result, and then merge from those two blocks into that space. Creating that space requires O(n) extra memory.
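A short mergesort sketch (my own illustrative version, not taken from the link) makes the explicit allocation visible: the `merged` list built in each merge step is the O(n) "place to put the merge":

```python
def mergesort(a):
    """Return a new sorted list; allocates O(n) scratch space in the merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = mergesort(a[:mid])
    right = mergesort(a[mid:])
    merged = []                  # the explicit O(n) allocation
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])      # copy whichever side has leftovers
    merged.extend(right[j:])
    return merged

print(mergesort([5, 2, 9, 1, 5, 6]))   # prints [1, 2, 5, 5, 6, 9]
```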
To get down to O(1) memory you need to both avoid allocating memory AND avoid calling yourself recursively. This is true of all of bubble, insertion and selection sorts.