Is it theoretically possible to have a data structure that has O(1) access, insertion, and deletion times, and dynamic length?
I'm guessing one hasn't been invented yet, or we would entirely forgo the use of arrays and linked lists (separately) and instead opt to use one of these.
Is there a proof that this cannot happen, i.e. some relationship between access time, insertion time, and deletion time (akin to a conservation law) which implies that if one of them becomes constant, another has to become linear, or something along those lines?
No such data structure exists on current architectures.
Informal reasoning:
- To get better than O(n) time for insertion/deletion, you need a tree data structure of some sort.
- To get O(1) random access, you can't afford to traverse a tree.

The best you can do is O(log n) for all of these operations. That's a fairly good compromise, and there are plenty of data structures that achieve it (e.g. a Skip List).
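To make the O(log n) claim concrete, here is a minimal sketch of a dynamically sized sequence with expected O(log n) access, insertion, and deletion at an arbitrary index. It uses an implicit treap rather than the skip list mentioned above (same complexity class, just more compact to write down); the `IndexedList` class and its method names are invented for this illustration.

```python
import random

class Node:
    __slots__ = ("value", "prio", "size", "left", "right")

    def __init__(self, value):
        self.value = value
        self.prio = random.random()   # random heap priority keeps the tree balanced in expectation
        self.size = 1                 # number of nodes in this subtree
        self.left = None
        self.right = None

def size(node):
    return node.size if node else 0

def update(node):
    node.size = 1 + size(node.left) + size(node.right)

def split(node, k):
    """Split the sequence under `node` into (first k elements, the rest)."""
    if node is None:
        return None, None
    if size(node.left) < k:
        left, right = split(node.right, k - size(node.left) - 1)
        node.right = left
        update(node)
        return node, right
    left, right = split(node.left, k)
    node.left = right
    update(node)
    return left, node

def merge(a, b):
    """Concatenate two sequences, keeping the heap property on priorities."""
    if a is None:
        return b
    if b is None:
        return a
    if a.prio > b.prio:
        a.right = merge(a.right, b)
        update(a)
        return a
    b.left = merge(a, b.left)
    update(b)
    return b

class IndexedList:
    """Dynamically sized sequence: expected O(log n) access, insert, delete by index."""

    def __init__(self):
        self.root = None

    def __len__(self):
        return size(self.root)

    def insert(self, i, value):
        left, right = split(self.root, i)
        self.root = merge(merge(left, Node(value)), right)

    def delete(self, i):
        left, rest = split(self.root, i)
        target, right = split(rest, 1)
        self.root = merge(left, right)
        return target.value

    def __getitem__(self, i):
        node = self.root
        while node:
            left_size = size(node.left)
            if i < left_size:
                node = node.left
            elif i == left_size:
                return node.value
            else:
                i -= left_size + 1
                node = node.right
        raise IndexError(i)

if __name__ == "__main__":
    xs = IndexedList()
    for i in range(10):
        xs.insert(len(xs), i * i)   # append
    xs.insert(3, -1)                # insert in the middle
    xs.delete(0)                    # delete the first element
    print(len(xs), xs[2])           # -> 10 -1
```

Every operation is a constant number of splits and merges, each of which walks one root-to-leaf path, hence the expected O(log n) bound for all three operations at once.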
You can also get "close to O(1)" by using trees with high branching factors. For example, Clojure's persistent data structures use 32-way trees, which gives you O(log32 n) operations. For practical purposes, that's fairly close to O(1) (i.e. for realistic sizes of n that you are likely to encounter in real-world collections).
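A rough sense of scale (the numbers below are just arithmetic, not a benchmark):

```python
import math

# Depth of a 32-way tree for realistic collection sizes
for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n = {n:>13,}  ->  log32(n) ~ {math.log(n, 32):.1f}")
```

Even a billion-element 32-way tree is only about six levels deep, which is why the "close to O(1)" behaviour holds up in practice.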
If you are willing to settle for amortized expected constant time for insertion, deletion, and lookup by key, that is exactly what a hash table gives you.
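As a trivial sketch, Python's built-in dict is such a hash table: insert, delete, and lookup by key all run in amortized expected O(1) time (though it does not give you positional access into an ordered sequence).

```python
table = {}

table["alice"] = 30          # insert: amortized O(1) expected
table["bob"] = 25
age = table["alice"]         # lookup: O(1) expected
del table["bob"]             # delete: O(1) expected

print(age, "alice" in table, "bob" in table)   # -> 30 True False
```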