I am slightly confused when I use the getsizeof function from the sys module on dictionaries. Below I have created a simple dictionary of two strings. The two strings' sizes are clearly larger than that of the dictionary. The dictionary size is probably the dictionary overhead only, i.e., it doesn't take the actual data into account. What is the best way to figure out the memory usage of the whole dictionary (keys, values, and dictionary overhead)?
>>> from sys import getsizeof
>>> first = 'abc' * 1000
>>> second = 'def' * 1000
>>> my_dictionary = {'first': first, 'second': second}
>>> getsizeof(first)
3021
>>> getsizeof(second)
3021
>>> getsizeof(my_dictionary)
140
From the Python docs:
See recursive sizeof recipe for an example of using getsizeof() recursively to find the size of containers and all their contents.
So getsizeof only counts the container's own overhead, but you can use the recipe linked there to calculate the total size for containers like dicts.
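A minimal sketch of such a recursive helper, simplified from the recipe the docs point to (the name total_size and the set of handled container types are choices made here, not part of the official recipe):

```python
import sys

def total_size(obj, seen=None):
    """Recursively sum sys.getsizeof over an object and its contents."""
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:   # avoid double-counting shared objects
        return 0
    seen.add(obj_id)
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        # count both keys and values
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

first = 'abc' * 1000
second = 'def' * 1000
my_dictionary = {'first': first, 'second': second}
print(total_size(my_dictionary))  # dict overhead + keys + values
```

Note the seen set: without it, an object reachable through two references (or a self-referencing container) would be counted twice or recurse forever.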
The recursive getsizeof would get the actual size, but if you have multiple layers of dictionaries and only want a rough estimate, json.dumps comes in handy: the length of the serialized string is a crude proxy for the data it contains.
>>> from sys import getsizeof
>>> import json
>>> first = 'abc' * 1000
>>> second = 'def' * 1000
>>> my_dictionary = {'first': first, 'second': second}
>>> getsizeof(first)
3049
>>> getsizeof(second)
3049
>>> getsizeof(my_dictionary)
288
>>> getsizeof(json.dumps(my_dictionary))
6076
>>> size = getsizeof(my_dictionary)
>>> size += sum(map(getsizeof, my_dictionary.values())) + sum(map(getsizeof, my_dictionary.keys()))
>>> size
6495