What is wrong with my code such that, on the 3rd pass of the while loop (nfind == 3), it raises MemoryError on the line CACHE[sha] = number?
The system has enough memory, and at the end of every while-loop iteration I release the allocated memory, yet the error still appears on the 3rd pass.
If you run this code, in some cases you may need to make XRAN = 2**23 larger or smaller (by one or two in the exponent) to reproduce the error.
Any help or suggestions would be appreciated.
from multiprocessing import Pool
from hashlib import sha256
from struct import pack
import gc

XRAN = 2**23

def compsha(number):
    return number, sha256(pack("Q", number)).digest()

if __name__ == '__main__':
    gc.enable()
    nfind = 1
    while nfind > 0:
        print(nfind)
        CACHE = {}
        pool = Pool()
        for i, output in enumerate(pool.imap_unordered(compsha, xrange((nfind-1)*XRAN, nfind*XRAN), 2)):
            number, sha = output
            CACHE[sha] = number
        pool.close()
        pool.join()
        if nfind != 0:
            nfind = nfind + 1
        del CACHE
=======================================================
>>>
1
2
Traceback (most recent call last):
File "D:\Python27\free_pool.py", line 20, in <module>
CACHE[sha] = number
MemoryError
In addition to Ned's answer about storing far too much in a dictionary that you never even read from, is it possible that you are running a 32-bit Python interpreter and hitting its address-space limit in your main process?
$ python -c "import sys; print sys.maxint" // 64-bit python
9223372036854775807
$ python-32 -c "import sys; print sys.maxint" // 32-bit
2147483647
On Windows, a 32-bit process is typically limited to between 2 and 4 GB of address space.
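A quick way to check whether the interpreter is 32-bit or 64-bit that works on both Python 2 and Python 3 (sys.maxint no longer exists on Python 3) is to look at the size of a C pointer:

```python
import struct

# struct.calcsize("P") is the size of a C pointer in bytes:
# 4 on a 32-bit interpreter, 8 on a 64-bit one.
bits = struct.calcsize("P") * 8
print("%d-bit Python" % bits)
```

If this prints 32, the main process cannot address more than 4 GB no matter how much RAM the machine has.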
You're running out of memory because you're trying to store 2**23 entries in a dictionary. That uses a lot of memory, apparently more than you have. You say you have enough RAM; how did you determine how much you would need?
You'll need to come up with a different algorithm.
Also, you don't seem to ever access CACHE, so why are you using it at all?
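As a rough sanity check, you can estimate the per-entry cost of such a dictionary with sys.getsizeof on a small sample and extrapolate to 2**23 entries (a sketch only; the exact numbers depend on the interpreter and build, and getsizeof does not count every allocation):

```python
import sys
from hashlib import sha256
from struct import pack

XRAN = 2**23  # number of entries the original loop stores per pass

# Build a small sample dict with the same key/value shapes:
# 32-byte SHA-256 digest keys mapping to small ints.
d = {}
for i in range(10000):
    d[sha256(pack("Q", i)).digest()] = i

key = next(iter(d))
# Amortized dict-slot cost plus the digest key and the int value.
per_entry = (sys.getsizeof(d) / float(len(d))
             + sys.getsizeof(key)
             + sys.getsizeof(d[key]))
print("roughly %.1f GB for 2**23 entries" % (per_entry * XRAN / 2.0**30))
```

On a 64-bit CPython this typically comes out on the order of a gigabyte per pass, before counting the Pool worker processes, which would easily exhaust a 32-bit address space.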