It could very well be that the answer to this question is an obvious and resounding "there's no such thing", but I'll give it a shot: Is there a functional map-like data structure that is more efficient than a standard map when the keys have an arbitrary, often very big, size?
For the sake of concreteness, consider the Haskell type
(Ord k) => Map [k] v
in which lookups can take a very long time, since each key comparison may have to walk the lists down to a deep level before finding a difference. I guess hashing is also out of the question due to the arbitrary length of the lists. I still can't help but think there could be a clever data structure out there.
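To make the worst case concrete, here is a minimal sketch (key contents and sizes are made up for illustration): two list keys sharing a long common prefix force every comparison during a `Map` lookup to traverse that whole prefix before reaching the differing element.

```haskell
import qualified Data.Map as Map

-- Two keys sharing a 100,000-element prefix. Every Ord comparison on
-- these keys must walk the entire shared prefix before it can decide.
keyA, keyB :: [Int]
keyA = replicate 100000 0 ++ [1]
keyB = replicate 100000 0 ++ [2]

m :: Map.Map [Int] String
m = Map.fromList [(keyA, "a"), (keyB, "b")]

-- Map.lookup keyA m repeats that full-prefix comparison at each tree
-- node visited on the way down.
```

With many such keys, a lookup performs O(log n) of these near-full-length comparisons.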
Is hashing out of the question? There's no prefix of the key structure that can be computed efficiently?
If not, how about a hashmap? Take the very big key, reduce it to something very small, use that as an index into a structure.
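A minimal sketch of that idea, assuming the key elements themselves are hashable (the polynomial hash and the `HashMapL` name here are illustrative, not a real library): reduce the big key to an `Int`, index an `IntMap` of buckets with it, and only fall back to full key comparison within a bucket.

```haskell
import qualified Data.IntMap.Strict as IntMap
import Data.List (foldl')

-- Illustrative polynomial rolling hash over the whole key. Note that
-- this still traverses the key once, but it replaces many long
-- comparisons with one hash plus a short bucket scan.
hashKey :: [Int] -> Int
hashKey = foldl' (\h x -> h * 31 + x) 0

-- Buckets of (key, value) pairs, to handle hash collisions.
newtype HashMapL v = HashMapL (IntMap.IntMap [([Int], v)])

emptyH :: HashMapL v
emptyH = HashMapL IntMap.empty

insertH :: [Int] -> v -> HashMapL v -> HashMapL v
insertH k v (HashMapL m) =
  HashMapL (IntMap.insertWith (++) (hashKey k) [(k, v)] m)

lookupH :: [Int] -> HashMapL v -> Maybe v
lookupH k (HashMapL m) =
  IntMap.lookup (hashKey k) m >>= lookup k  -- full comparison only in-bucket
</imports-placeholder>
```

The catch the question already anticipates: computing the hash is itself O(key length), so this only wins when lookups would otherwise do several long comparisons.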
A trie?
If you have two long keys that are almost identical, a Map will compare them both from the beginning at every node it visits, but a trie only compares the suffixes that haven't already been eliminated by earlier branching. So a trie would be more time-efficient in that situation.
Tries can be optimised in various ways, and you might also want to look at ternary trees.
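A minimal, unoptimised trie sketch for list keys (type and function names are my own, not from a library): each level of the trie consumes one element of the key, so shared prefixes are walked once per lookup rather than once per comparison.

```haskell
import qualified Data.Map as Map

-- A trie keyed by lists: an optional value at this node, plus one
-- child subtree per possible next key element.
data Trie k v = Trie (Maybe v) (Map.Map k (Trie k v))

emptyT :: Trie k v
emptyT = Trie Nothing Map.empty

insertT :: Ord k => [k] -> v -> Trie k v -> Trie k v
insertT []     v (Trie _  cs) = Trie (Just v) cs
insertT (x:xs) v (Trie mv cs) =
  Trie mv (Map.insert x (insertT xs v child) cs)
  where child = Map.findWithDefault emptyT x cs

lookupT :: Ord k => [k] -> Trie k v -> Maybe v
lookupT []     (Trie mv _)  = mv
lookupT (x:xs) (Trie _  cs) = Map.lookup x cs >>= lookupT xs
```

Each key element is examined exactly once per lookup, at a cost of one small `Map` lookup per level; a production version would compress chains of single-child nodes (a radix/Patricia trie) to cut that per-element overhead.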