Does anyone have a list of rough rule-of-thumb memory estimators for the various Java data structures? e.g.
- Arrays
- Lists
- HashMaps
- LinkedLists
I remember seeing some of these estimates thrown around in various places, but I can't seem to find one right now.
I know it's actually incredibly complicated, especially for things like HashMaps, but I'm looking for something really rough, like:
Memory(HashMap) = fixedOverhead + variableOverhead * tableSize + A*numKeys + B*numValues + Memory(allKeys) + Memory(allValues)
Of course it'll vary a lot depending on the JVM, the collection implementation, and so on, but even a rough within-a-factor-of-2 estimate would be immensely useful.
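For concreteness, here is the kind of back-of-the-envelope estimator I have in mind, as code. Every constant below is a placeholder guess for a 64-bit HotSpot JVM with compressed oops, not a measured value:

```java
// Hypothetical ballpark estimator; the constants are assumptions, not
// measurements. Swap in numbers for your own JVM before trusting it.
final class HashMapMemoryEstimate {
    static final long FIXED_OVERHEAD = 48;  // the HashMap object itself (guess)
    static final long BYTES_PER_SLOT = 4;   // one compressed reference per table slot
    static final long BYTES_PER_ENTRY = 32; // entry header + hash + key/value/next refs

    /** Rough bytes used by the map's own bookkeeping, excluding keys and values. */
    static long estimate(int tableSize, int numEntries) {
        // Each entry holds both the key and value references, so the A and B
        // terms from the formula above collapse into BYTES_PER_ENTRY here.
        return FIXED_OVERHEAD
                + BYTES_PER_SLOT * (long) tableSize
                + BYTES_PER_ENTRY * (long) numEntries;
    }
}
```

Add Memory(allKeys) + Memory(allValues) on top of that to get the total footprint.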
Check out the article "From Java code to Java heap: Understanding and optimizing your application's memory usage".
This table is quite exhaustive and deals precisely with the JDK implementation choices, measured in bytes per entry/element: http://code.google.com/p/memory-measurer/wiki/ElementCostInDataStructures. If you want to reproduce the numbers on your own machine -- if you're running on a different JVM, perhaps -- the same Google Code project lets you download the measurement tool's source.
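If you'd rather measure than estimate, here is a minimal sketch using the standard java.lang.instrument API. The agent packaging is assumed setup, and note that getObjectSize reports shallow size only:

```java
import java.lang.instrument.Instrumentation;

// Minimal shallow-size measurer using the standard java.lang.instrument API.
// Package this class in a jar whose manifest sets Premain-Class: ObjectSizer,
// then run your program with -javaagent:that.jar (assumed setup).
public final class ObjectSizer {
    private static volatile Instrumentation instrumentation;

    // Called by the JVM before main() when this jar is loaded as an agent.
    public static void premain(String agentArgs, Instrumentation inst) {
        instrumentation = inst;
    }

    // Shallow size of one object: its header plus fields, but not the
    // objects it references (so a HashMap's entries are NOT included).
    public static long shallowSizeOf(Object o) {
        return instrumentation.getObjectSize(o);
    }
}
```

To size a whole structure you'd have to walk the object graph and sum shallow sizes, which is essentially what the memory-measurer tool linked above does for you.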