I'm using Portable Python 2.7.5.1.
The following line:
x = [{} for i in range(11464882)]
Causes a memory error (with no other messages):
>>>
Traceback (most recent call last):
File "<module1>", line 12, in <module>
MemoryError
>>>
Note: there are only comments in lines 1-11.
If I decrease that oddly specific number by one, the error disappears.
Considering that 11 million isn't that large, I believe there must be some simple setting that can increase the amount of memory available to my programs.
So, is this something simple I'm missing or an inherent memory limit?
On my Mac OS X 10.8.5 64-bit laptop, a range(11464882)
object requires:
>>> import sys
>>> sys.getsizeof(range(11464882))
91719128
>>> sys.getsizeof(11464881) # largest number
24
>>> sys.getsizeof(0) # smallest number
24
>>> 91719128 + (24 * 11464882) # bytes
366876296
>>> (91719128 + (24 * 11464882)) / (1024.0 ** 2) # mb
349.88050079345703
so 350 megabytes of memory.
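The range() call is this costly because Python 2's range() materialises a full list of int objects up front. Just to illustrate (assuming CPython 2.7 on a 64-bit build; exact figures vary between builds), the lazy xrange object stays constant-size, although that does nothing about the dictionaries themselves:
>>> import sys
>>> sys.getsizeof(range(11464882))   # a real list with 11.4 million slots
91719128
>>> sys.getsizeof(xrange(11464882))  # a lazy range object; size does not depend on length
40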
Here, sys.getsizeof()
returns the memory footprint of the given Python object, not counting contained values. So, for a list, it is just the memory the list structure itself requires: the bookkeeping information plus 11 million 64-bit pointers.
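To make that concrete, here is a quick illustration (assuming CPython 2.7 on a 64-bit build; the exact byte counts vary between builds) showing that a list's reported size grows with the number of slots, not with what those slots point to:
>>> import sys
>>> sys.getsizeof([])          # an empty list: just the list header
72
>>> sys.getsizeof([{}])        # one extra 8-byte pointer slot
80
>>> sys.getsizeof([{'a': 1}])  # same size; the dict's own contents are not counted
80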
In addition, that many empty dictionaries take:
>>> sys.getsizeof({})
280
>>> 91719128 + 280 * 11464882
3301886088
>>> (91719128 + 280 * 11464882) / (1024.0 ** 2) # mb
3148.923957824707
>>> (91719128 + 280 * 11464882) / (1024.0 ** 3) # gb
3.0751210525631905
3 gigabytes of memory. 11 million times 280 bytes is a lot of space.
Together with other overhead (most likely garbage collection cycle detection, the Python process itself, and memoized values), that means you hit your machine's 4GB per-process memory limit.
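You can sanity-check an estimate like this without allocating the full 11 million dictionaries, by measuring a smaller sample and scaling it up. A minimal sketch (assuming CPython 2.7, and summing getsizeof() over the contents yourself, since the list's own size does not include them):
import sys

N = 11464882        # the size from the question
sample = 100000     # measure a smaller list and extrapolate

lst = [{} for _ in range(sample)]
scale = float(N) / sample

list_bytes = sys.getsizeof(lst) * scale                  # the list structure (pointers)
dict_bytes = sum(sys.getsizeof(d) for d in lst) * scale  # the empty dicts themselves

# roughly 3 GB on a 64-bit build, in line with the figures above
print "estimated footprint: %.2f GB" % ((list_bytes + dict_bytes) / 1024.0 ** 3)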
If you are running a 32-bit binary, sizes will be smaller as you'd only need room for 32-bit pointers, but you also only get 2GB of addressable memory to fit all your objects into.
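If you are not sure whether your Portable Python build is 32-bit or 64-bit, you can check the pointer width directly (a quick check; the output shown here is what a 64-bit build reports):
>>> import struct, sys
>>> struct.calcsize('P') * 8   # pointer width in bits: 32 or 64
64
>>> sys.maxsize                # 2**31 - 1 on 32-bit builds, 2**63 - 1 on 64-bit
9223372036854775807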