Why is the complexity of ArrayList add() and add(int index, E) amortized constant time?
Why isn't it O(1) for a single add() operation, O(n) for a single add(int index, E) operation, and O(n) for adding n elements (n add operations) using either add method? Suppose we use add(int index, E) to add to the end of the array only rarely.
Isn't the single-operation complexity for an array (and ArrayList) that already holds n elements: (1) O(1) for add() and (2) O(n) for add(int index, E)?
If we make a million insertions, the average of 1 and 2 cannot be O(1), right?
Why does Oracle say
The add operation runs in amortized constant time, that is, adding n elements requires O(n) time.
I thought the complexity was O(1) for add() and O(n) for add(int index, E).
Does it mean that the "integral complexity of n operations" (each operation of complexity O(1)) is, so to speak, n*O(1) = O(n)? What am I missing?
Maybe for Oracle "add operation" always means only add()? And add(int, E) is an "insertion operation"? Then it's totally clear!
I have a guess that it has to do with the difference between asymptotic analysis and amortized analysis, but I cannot quite grasp it.
Related questions:
- What is amortized analysis of algorithms?
- Why is the time complexity of an array insertion O(n) and not O(n+1)?
- Is it more appropriate to say "amortized O(1)" vs "O(n)" for insertion into an unsorted dynamic array? (not quite clear)
- Big O when adding together different routines
The add operation runs in amortized constant time, that is, adding n elements requires O(n) time. All of the other operations run in linear time (roughly speaking). The constant factor is low compared to that for the LinkedList implementation. Each ArrayList instance has a capacity.
Worst case - O(n): in general, if we have n elements, we may need to shift all n of them, so the worst-case time complexity is O(n).
Basically, the average time is constant, but an individual operation may take more time - in this case, when the underlying array has to resize.
In Oracle's terms (which are tacitly implied), and speaking about List, there are two add methods:
boolean add(E e)
void add(int index, E element)
When Oracle writes
The add operation runs in amortized constant time, that is, adding n elements requires O(n) time.
it means:
the complexity of a single boolean add(E) operation is amortized O(1).
It cannot always be plain asymptotic O(1), because occasionally we need to grow the array's capacity. That single add operation, which is in fact a "create a new, bigger array, copy the old array into it, then append one element to the end" operation, has O(n) asymptotic complexity: copying the array when increasing the List's capacity is O(n), and the complexity of growing plus adding is O(n) + O(1) = O(n). Without the capacity-growing step, add would be O(1): the element is simply appended to the end of the array (at the max index). If we "added" (that is, inserted) somewhere other than the end of the array, we would need to shift the rightmost elements (those with bigger indexes), and the complexity of a single such operation would be O(n).
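The two paths of a single add(E) can be sketched as follows. This is a minimal illustration, not the real java.util.ArrayList source; the class name SimpleList is hypothetical, and it doubles the capacity for simplicity (the real ArrayList grows by roughly 1.5x):

```java
import java.util.Arrays;

// Hypothetical sketch of a dynamic array's append operation.
class SimpleList {
    private Object[] data = new Object[4];
    private int size = 0;

    void add(Object e) {
        if (size == data.length) {
            // Rare O(n) path: allocate a bigger array and copy all n elements.
            data = Arrays.copyOf(data, data.length * 2);
        }
        data[size++] = e; // Common O(1) path: append at the end.
    }

    int size() { return size; }
}
```

Most calls hit only the O(1) append; the O(n) copy happens only when size reaches the current capacity.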
Now, for a single add operation, the asymptotic complexity is O(1) for an add-without-growing-capacity and O(n) for an add-with-growing-capacity (which happens very rarely).
The amortized complexity of a single add operation is O(1). It reflects the fact that the rare O(n) grow-and-add operations are "diluted" by the far more numerous O(1) add-without-grow ones, so on average a single operation is O(1).
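You can verify the "dilution" by counting how many element copies a sequence of n appends actually performs. The sketch below (hypothetical names, doubling growth assumed) shows that with geometric growth the total copy work stays below 2n, so the per-add cost is amortized O(1):

```java
// Count element copies performed while appending n elements
// to a dynamic array that doubles its capacity when full.
class CopyCounter {
    static long copiesFor(int n) {
        int capacity = 4;
        int size = 0;
        long copies = 0;
        for (int i = 0; i < n; i++) {
            if (size == capacity) {
                copies += size;   // growing copies all current elements
                capacity *= 2;
            }
            size++;               // the append itself is O(1)
        }
        return copies;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        // With doubling, total copies are 4 + 8 + 16 + ... < 2n.
        System.out.println(CopyCounter.copiesFor(n) + " copies for " + n + " adds");
    }
}
```

A million appends trigger only about twenty resizes, and the copy counts form a geometric series bounded by 2n.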
The "asymptotic complexity" of n add operations is O(n). But here we are speaking about the complexity of n operations, not the complexity of one operation, so "asymptotic complexity" is a loose way to put it. "Amortized complexity" of n operations makes even less sense.
Finally, for void add(int index, E),
the complexity of a single operation is always O(n). If it also triggers a grow, it is O(n) + O(n) [grow + insert], but 2*O(n) is the same as O(n).
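The O(n) comes from shifting the trailing elements one slot to the right. A hypothetical sketch (again, not the real ArrayList source; the class name InsertSketch is made up) of what add(int index, E) must do:

```java
import java.util.Arrays;

// Hypothetical sketch of positional insertion into a dynamic array.
class InsertSketch {
    private Object[] data = new Object[4];
    private int size = 0;

    void add(int index, Object e) {
        if (size == data.length) {
            data = Arrays.copyOf(data, data.length * 2); // rare O(n) grow
        }
        // Shift the (size - index) elements after the insertion point
        // one slot right: O(n) in the worst case (index == 0).
        System.arraycopy(data, index, data, index + 1, size - index);
        data[index] = e;
        size++;
    }

    Object get(int index) { return data[index]; }
    int size() { return size; }
}
```

Inserting at index 0 shifts all n existing elements, while inserting at index size shifts none, which is why append is the cheap special case.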