Assume that I have two lists named a and b, both of size n, and I want to do the following slice-assignment operation with k < n:
a[:k] = b[:k]
The Python wiki's Time Complexity page says that the complexity of slice assignment is O(n+k), where k is the length of the slice. I cannot understand why it is not just O(k) in the situation above.
I know that slicing returns a new list, so it is O(k), and I know that a list stores its items contiguously, so inserting an item in the middle takes O(n) time. But the operation above can easily be done in O(k) time. Am I missing something?
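To make that concrete, the equal-length case could in principle be done with k plain index assignments and no shifting at all (a sketch, assuming a, b, and k are defined as above):

# Has the same effect as a[:k] = b[:k] when both slices have length k:
# k independent index assignments; no element has to move.
for i in range(k):
    a[i] = b[i]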
Furthermore, is there documentation where I can find detailed information about such issues? Should I look into the CPython implementation?
Thanks.
O(n+k) is the average case, which includes having to grow or shrink the list when the number of elements inserted differs from the length of the slice being replaced.
In your case, where you replace the slice with an equal number of new elements, the implementation only needs O(k) steps. But averaged over all possible combinations of elements inserted and deleted, the list has to move its remaining n elements up or down.
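A rough timing sketch illustrates the difference; the sizes and names here are just for illustration, and the exact numbers will vary by machine:

import time

n = 5_000_000
k = 10

def measure(label, new_items):
    a = list(range(n))
    start = time.perf_counter()
    a[:k] = new_items               # the slice assignment being measured
    print(f"{label}: {time.perf_counter() - start:.6f} s")

# Equal-length replacement: only k slots are overwritten, nothing shifts.
measure("equal length", list(range(k)))

# Shrinking replacement: the n - k trailing elements must all move down.
measure("shrinking   ", list(range(k // 2)))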
See the list_ass_slice function for the exact implementation.
You're right, if you want to know the exact details it's best to use the source. The CPython implementation of setting a slice is in listobject.c.
If I read it correctly, it will:

1. make a temporary copy of the items being replaced, so their references can be released afterwards (O(k) for the old slice),
2. if the number of new items differs from the number removed, shift the items after the slice up or down and resize the list (O(n)),
3. copy the new items into place (O(k)),
4. release the references to the replaced items (O(k)).

That adds up to O(n+k).
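As a rough pure-Python model of those steps (just an illustration of the bookkeeping, not the actual C code; the function name assign_slice is made up):

def assign_slice(a, lo, hi, new_items):
    # Model of what a[lo:hi] = new_items has to do -- not the real C implementation.
    old = a[lo:hi]                    # items being replaced; kept so they can be released later
    if len(new_items) != len(old):
        # Sizes differ: everything after the slice has to move -- up to O(n) item moves.
        tail = a[hi:]
        del a[lo:]                    # truncate at the start of the slice
        a.extend(new_items)           # write the new items -- O(k)
        a.extend(tail)                # put the tail back in its shifted position
    else:
        # Same number of items: overwrite in place, nothing shifts -- O(k).
        for i, item in enumerate(new_items):
            a[lo + i] = item
    # `old` is dropped here; in the C code this is where the old items are released.

Calling assign_slice(a, 0, k, b[:k]) hits the equal-length branch, which is the O(k) path the question asks about.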
Of course, your case is not that worst case: you're replacing the first k elements with exactly k new elements, so nothing after the slice needs to shift, which reduces the cost to the O(k) you expected. That is not true in general, though.