Could anyone explain why insertion sort has a time complexity of Θ(n²)?
I'm fairly certain that I understand time complexity as a concept, but I don't really understand how to apply it to this sorting algorithm. Should I just look to mathematical proofs to find this answer?
On average each insertion must traverse half the currently sorted list while making one comparison per step. The list grows by one each time.
So starting with a sorted list of length 1 and inserting the next item to get a list of length 2, we have an average traversal of 0.5 places (0 or 1). The rest are 1 (0, 1, or 2 places), 1.5, 2, ..., n/2 for a list of length n+1.
This is, by simple algebra, 0.5 + 1 + 1.5 + ... + n/2 = (1 + 2 + ... + n)/2 = n(n+1)/4 ≈ n^2 / 4, which is Θ(n^2): the sum is within constant factors of n^2 from both above and below, giving the tight bound the question asks about.
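A quick way to see this empirically: the sketch below (plain Python; the function name and counting scheme are my own, not from the question) counts comparisons on random input and checks that the count grows like n^2 / 4.

    import random

    def insertion_sort(a):
        """Sort a in place; return the number of comparisons made."""
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Walk left through the sorted prefix a[0:i] until key fits.
            while j >= 0:
                comparisons += 1
                if a[j] <= key:
                    break
                a[j + 1] = a[j]  # shift the larger element one place right
                j -= 1
            a[j + 1] = key
        return comparisons

    for n in (100, 1000, 10000):
        data = [random.random() for _ in range(n)]
        c = insertion_sort(data)
        print(n, c, n * n / 4)  # c should be close to n^2 / 4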
Note that this is the average case. In the worst case the sorted list must be fully traversed on every insertion (you are always inserting the next-smallest item into the ascending list, i.e. the input is in reverse order). Then you have 1 + 2 + ... + n = n(n+1)/2, which is still O(n^2).
In the best case (the input is already sorted) you find the insertion point at the top element with one comparison, so you have 1 + 1 + 1 + ... (n times) = O(n).
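Running the counting sketch from above on already-sorted and reverse-sorted input shows both extremes directly (this reuses the insertion_sort function defined earlier):

    n = 1000
    best = insertion_sort(list(range(n)))          # already ascending
    worst = insertion_sort(list(range(n, 0, -1)))  # descending: worst case
    print(best)   # n - 1 comparisons: one per insertion
    print(worst)  # n*(n-1)/2 comparisons: full traversal every time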
This analysis only applies to arrays and linked lists, i.e. structures where finding the position and inserting cost O(n) combined. The bound can differ for other data structures. For example, with a skip list it will be O(n * log(n)), because a binary-style search is possible in O(log(n)) in a skip list, while the insert/delete at a known position takes expected constant time.
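For contrast, here is a hypothetical sketch using Python's bisect module. Binary search cuts the search down to O(log n) comparisons per insertion, but on an ordinary array the insert itself still shifts O(n) elements, so the overall running time stays Θ(n^2). Only a structure with cheap positional insertion, like the skip list mentioned above, gets the full O(n log n).

    import bisect

    def binary_insertion_sort(a):
        out = []
        for x in a:
            # O(log n) comparisons to find the slot...
            pos = bisect.bisect_left(out, x)
            # ...but list.insert still shifts O(n) elements.
            out.insert(pos, x)
        return out

    print(binary_insertion_sort([3, 1, 4, 1, 5, 9, 2, 6]))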