UPDATE: This problem resolved itself after a machine reboot. I have not yet been able to figure out why the error was happening before.
I have a function that loads a huge numpy array (~980 MB) and returns it.
When I first start IPython and call this function, it loads the array into the variable without any problem.
But if I run the same command again, it exits raising a MemoryError.
I tried the following:
del hugeArray
The same error still occurred. I even tried the following:
del hugeArray
gc.collect()
gc.collect()
The first gc.collect() returned 145 and the second call returned 48.
But even after this, calling the function still raised a MemoryError.
The only way I could load the array again was to restart IPython. Is there something I can do to free all memory in IPython, so that I don't have to restart it?
----------------Update
Following is the output of %whos
Variable   Type      Data/Info
------------------------------
gc         module    <module 'gc' (built-in)>
gr         module    <module 'Generate4mRamp' <...>rom 'Generate4mRamp.pyc'>
np         module    <module 'numpy' from '/us<...>ages/numpy/__init__.pyc'>
plt        module    <module 'matplotlib.pyplo<...>s/matplotlib/pyplot.pyc'>
Of these, gr is my module containing the function I used to load the data cube.
---------How to Reproduce the error
The following simple function is able to reproduce the error.
import numpy as np
import gc

def functionH():
    cube = np.zeros((200, 1024, 1024))
    return cube

testcube = functionH()  # Runs without any issue
del testcube
testcube = functionH()  # Raises MemoryError
del testcube
gc.collect()
gc.collect()
testcube = functionH()  # Still raises MemoryError
This error occurs only in IPython. In the plain Python interpreter (>>>), there is no MemoryError after del testcube.
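A quick way to narrow this kind of thing down (a diagnostic sketch, not part of the original report; it reuses the testcube name from above) is to ask the garbage collector what still refers to the array:

import gc
import sys

import numpy as np

testcube = np.zeros((200, 1024, 1024))

# Refcount is 2 here: one for the name `testcube`, plus the temporary
# reference created by the getrefcount() call itself.
print(sys.getrefcount(testcube))

# In a plain interpreter this usually lists only the module's globals dict;
# in IPython, extra entries (for example the Out cache, once the array has
# been displayed) point to references that keep the memory alive after `del`.
for referrer in gc.get_referrers(testcube):
    print(type(referrer))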
Are you looking at the value? IPython caches output variables as e.g. Out[8], so if you examine it, it will be kept in memory.
You can do %xdel testcube to delete the variable and remove it from IPython's cache. Alternatively, %reset out or %reset array will clear either all your output history, or only references to numpy arrays.
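As an illustration, a session along these lines (a sketch reusing the testcube array from the reproduction above; the displayed output is truncated) shows where the extra reference comes from and how to clear it:

In [1]: import numpy as np

In [2]: testcube = np.zeros((200, 1024, 1024))

In [3]: testcube            # displaying the value stores a reference in Out[3]
Out[3]: array([[[0., 0., ...

In [4]: %xdel testcube      # deletes the name and removes IPython's cached references

In [5]: %reset -f out       # alternatively: clear the whole output history (-f skips the prompt)

In [6]: %reset -f array     # or clear only variables that are NumPy arrays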