I've found that there is a lot of controversy about the asymptotic complexity of List.Add(). I suspect the source of it is the worst-case scenario, in which the underlying array has to resize, which is logically an O(n) operation. However, the array doubles in size each time the list runs out of space, so the number of resizes needed for n elements is proportional to log(n).
Doesn't that mean that the asymptotic complexity of the Add operation in the average case will be O(n/log(n))?
The real benchmark for List.Add() is below. However, benchmarks are not very expressive for such an operation: we might run out of memory before any deviation from a straight line (on a logarithmic scale) becomes visible.
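As a rough stand-in for the benchmark chart, here is a minimal illustrative C# sketch (not that benchmark) that prints the list's Capacity whenever it changes. It makes the doubling growth visible: only about log2(n) resizes occur over a million appends. The exact growth policy is an implementation detail of List<T>, so treat the doubling as an assumption.

    using System;
    using System.Collections.Generic;

    class CapacityGrowthDemo
    {
        static void Main()
        {
            var list = new List<int>();
            int lastCapacity = list.Capacity;
            Console.WriteLine($"n = 0, capacity = {lastCapacity}");

            for (int i = 0; i < 1_000_000; i++)
            {
                list.Add(i);
                if (list.Capacity != lastCapacity)
                {
                    // Each resize roughly doubles the capacity
                    // (the exact growth factor is an implementation detail).
                    lastCapacity = list.Capacity;
                    Console.WriteLine($"n = {list.Count}, capacity = {lastCapacity}");
                }
            }
            // Only around log2(1,000,000) ~ 20 capacity changes get printed.
        }
    }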
This means that the amortized complexity of List.Add() can be calculated by summing the cost of the resize operations and then dividing by the total number of additions made to the list.
T(n) = (2 + 4 + 8 + ... + n/2 + n) / n
But note that the summation is a geometric series that adds up to less than 2n, so we can do better than the naive bound of n*log(n) (log(n) terms of at most n each):
T(n) < 2n/n = 2 -> T(n) is in O(1)
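To make that sum concrete, here is a small C# sketch that simulates a doubling array (a simulation of the analysis above, not List<T> itself; the starting capacity of 1 is an arbitrary assumption). It counts the element copies performed by all resizes over n appends and divides by n; the ratio stays below 2, matching the bound.

    using System;

    class AmortizedCopyCount
    {
        static void Main()
        {
            long copies = 0;   // total elements moved by all resizes so far
            int capacity = 1;  // assumed starting capacity for this simulation
            int count = 0;

            for (int n = 1; n <= 10_000_000; n++)
            {
                if (count == capacity)
                {
                    copies += count;   // a resize copies every existing element
                    capacity *= 2;     // doubling growth, as in the analysis
                }
                count++;               // the append itself is a single write

                if (n % 1_000_000 == 0)
                    Console.WriteLine($"n = {n,10}: copies/n = {(double)copies / n:F3}");
            }
            // copies/n stays below 2, so each append is O(1) amortized.
        }
    }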
Note: Here I am assuming you mean add() as appending. Inserting an element at an arbitrary location takes O(n) time, and you would have to account for that as well, which changes the end result from O(1) amortized complexity to O(n) amortized complexity.
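To illustrate the note, here is a rough C# sketch (timings are machine-dependent and only indicative) that compares appending with Add against inserting at the front with Insert(0, ...), which must shift every existing element on each call:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    class AppendVsInsert
    {
        static void Main()
        {
            const int N = 100_000;

            var appendList = new List<int>();
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < N; i++)
                appendList.Add(i);          // O(1) amortized per call
            Console.WriteLine($"Add:         {sw.ElapsedMilliseconds} ms");

            var insertList = new List<int>();
            sw.Restart();
            for (int i = 0; i < N; i++)
                insertList.Insert(0, i);    // O(n) per call: shifts all existing elements
            Console.WriteLine($"Insert(0,_): {sw.ElapsedMilliseconds} ms");
        }
    }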