I'm putting around 4 million different keys into a Python dictionary. Creating this dictionary takes about 15 minutes and consumes about 4GB of memory on my machine. After the dictionary is fully created, querying the dictionary is fast.
I suspect that dictionary creation is so resource-consuming because the dictionary is rehashed very often as it grows. Is it possible to create a dictionary in Python with some initial size or bucket count?
My dictionary points from a number to an object.
class MyObject:
    def __init__(self):
        # some fields...

d = {}
d[i] = MyObject()  # 4M times, each with a different key...
With performance issues it's always best to measure. Here are some timings:
import itertools

d = {}
for i in xrange(4000000):
    d[i] = None
# 722ms

d = dict(itertools.izip(xrange(4000000), itertools.repeat(None)))
# 634ms

dict.fromkeys(xrange(4000000))
# 558ms

s = set(xrange(4000000))
dict.fromkeys(s)
# 353ms, not including the set construction
The last option doesn't do any resizing; it just copies the hashes from the set and increments the references. As you can see, the resizing isn't taking much time. It's probably your object creation that is slow.
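To check that theory, it may help to time object creation separately from dict construction. Here is a minimal sketch (MyObject with a single field is a stand-in assumption; Python 2 style to match the timings above, and the numbers will vary by machine):

import time

class MyObject(object):
    def __init__(self):
        self.x = 0  # stand-in for the real fields

start = time.time()
objs = [MyObject() for i in xrange(4000000)]   # object creation alone
print 'object creation: %.2fs' % (time.time() - start)

start = time.time()
d = dict.fromkeys(xrange(4000000))             # dict construction alone
print 'dict.fromkeys:   %.2fs' % (time.time() - start)

If the first timing dominates, the dict itself isn't the bottleneck.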
I tried:

a = dict.fromkeys(range(4000000))

It creates a dictionary with 4,000,000 entries in about 3 seconds. After that, setting values is really fast, so I guess dict.fromkeys is definitely the way to go.
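Applied to the original problem, that could look roughly like this (a sketch; MyObject stands in for the questioner's class, which isn't shown in full):

d = dict.fromkeys(range(4000000))  # all 4M keys are inserted up front
for key in d:
    d[key] = MyObject()            # overwriting an existing key never resizes the dict

Since only values are replaced and no keys are added or removed, iterating over the dict while assigning is safe.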