I know there are a ton of numpy memory error topics, so I hope I haven't duplicated anything. I'm trying to create an np array using np.zeros((500000,10000)). This works fine on my Mac with 16 GB of memory, but on a Linux server with 28 GB of RAM it fails instantly with a MemoryError. I've verified that I'm running the 64-bit versions of Ubuntu and Python, and I'm on NumPy 1.9.3. The only difference I noticed between the systems (apart from the obvious) is that when running ulimit -a I get:
Linux: max locked memory (kbytes, -l) 64
Mac: max locked memory (kbytes, -l) unlimited
Could this be the reason I can't run this command? If not, is there some other configuration option I'm missing?
My best guesses are:
1. The array simply does not fit in the memory available.
2. np.zeros does not actually allocate the memory until it is first accessed.
I base my first guess on the fact that with the default float64 dtype your array will take 500000*10000*8 = 40 GB of RAM (20 GB with float32), and therefore the array does not fit in the memory you have. Swap may account for part of the missing memory.
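That size arithmetic can be checked directly in NumPy without allocating the full array, by multiplying the shape by the dtype's item size (a minimal sketch; the 40 GB figure assumes the default float64 dtype):

```python
import numpy as np

shape = (500000, 10000)

# Bytes per element for the default float64 dtype (8 bytes).
itemsize = np.dtype(np.float64).itemsize

# Total bytes the array would need: 500000 * 10000 * 8 = 40 GB.
total_bytes = shape[0] * shape[1] * itemsize
print(total_bytes)            # 40000000000
print(total_bytes / 1024**3)  # ~37.25 GiB

# For a small array that does fit, .nbytes reports the same arithmetic.
a = np.zeros((500, 100))
print(a.nbytes)               # 500 * 100 * 8 = 400000
```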
I base my second guess on this link, where it is explained that np.zeros will not actually allocate the zeros in memory until that memory is accessed for the first time. I have tested on my Linux (Ubuntu) computer that np.zeros works with increasingly large arrays until I reach my RAM limit; then I get a MemoryError even though the memory is never actually touched.
Once you create the matrix (increase the size enough to make the memory usage noticeable):
a = np.zeros((50,10))
You can check the actual memory required by storing a zero in each cell of the matrix:
a[:,:] = 0.0
Or by forcing an operation so that the memory is accessed and therefore allocated:
a = a + a
Keep track of the memory usage of the computer while performing this check to understand when the memory is allocated.
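The steps above can be combined into one small script that tracks the process's peak resident memory with the standard resource module, before and after touching the array (a sketch, assuming Linux, where ru_maxrss is reported in kilobytes; on macOS it is in bytes):

```python
import resource
import numpy as np

def peak_rss():
    # Peak resident set size of this process
    # (kilobytes on Linux, bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()

# ~400 MB array: 5000 * 10000 * 8 bytes. np.zeros returns immediately
# because the zeroed pages are not actually committed yet.
a = np.zeros((5000, 10000))
after_zeros = peak_rss()

# Writing to every cell forces the kernel to actually back the pages,
# so resident memory jumps by roughly the size of the array.
a[:, :] = 0.0
after_touch = peak_rss()

print(before, after_zeros, after_touch)
```

Watching those three numbers shows the allocation happening at the write, not at the np.zeros call.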