I recently came across this article about Python memory allocation.
The page describes Python's memory usage and includes an example showing a deepcopy of a list of integers. I ran the benchmark myself on Python 2.7.
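My setup looks roughly like this (a minimal sketch; the file name is arbitrary and it assumes the memory_profiler package is installed):

import copy
from memory_profiler import profile

@profile
def function():
    x = list(range(1000000))  # allocate a big list
    y = copy.deepcopy(x)
    del x
    return y

if __name__ == '__main__':
    function()  # running the script prints a line-by-line report like the one below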
Line # Mem usage Increment Line Contents
================================================
4 28.051 MiB 0.000 MiB @profile
5 def function():
6 59.098 MiB 31.047 MiB x = list(range(1000000)) # allocate a big list
7 107.273 MiB 48.176 MiB y = copy.deepcopy(x)
8 99.641 MiB -7.633 MiB del x
9 99.641 MiB 0.000 MiB return y
So deleting x directly only removes the name x and x's references to the integers, right?
Doing this does not help either (so what is the difference between del x and del x[:]? see the small sketch after this table):
Line # Mem usage Increment Line Contents
================================================
4 28.047 MiB 0.000 MiB @profile
5 def function():
6 59.094 MiB 31.047 MiB x = list(range(1000000)) # allocate a big list
7 107.270 MiB 48.176 MiB y = copy.deepcopy(x)
8 99.637 MiB -7.633 MiB del x[:]
9 99.637 MiB 0.000 MiB return y
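To make the distinction concrete, here is a small sketch outside the profiler (my own illustration): del x only unbinds the name x, while del x[:] empties the list object in place, which every other reference to that same list also sees.

a = [1, 2, 3]
b = a          # second reference to the same list object
del a          # only unbinds the name 'a'; the list survives because 'b' still refers to it
print(b)       # [1, 2, 3]

a = [1, 2, 3]
b = a
del a[:]       # empties the list object in place
print(b)       # [] -- 'b' sees the change because it is the same object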
In contrast to deepcopy, if I use copy, then after the deletion the memory seems to go back roughly to where it was right after x was created:
Line # Mem usage Increment Line Contents
================================================
4 28.039 MiB 0.000 MiB @profile
5 def function():
6 59.090 MiB 31.051 MiB x = list(range(1000000)) # allocate a big list
7 66.895 MiB 7.805 MiB y = copy.copy(x)
8 59.262 MiB -7.633 MiB del x[:]
9 59.262 MiB 0.000 MiB return y
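As a sanity check on why the shallow copy is so much cheaper, here is a small sketch (my own, not from the article): copy.copy only builds a new outer list whose slots point at the very same element objects, so the only memory del x[:] can give back is x's pointer array.

import copy

x = list(range(1000000))
y = copy.copy(x)               # shallow copy: new outer list, same element objects
print(x is y)                  # False -- two distinct list objects
print(x[999999] is y[999999])  # True  -- the elements themselves are shared
del x[:]                       # only x's pointer array can be reclaimed
print(len(y))                  # 1000000 -- y is unaffected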
For dict:
Line # Mem usage Increment Line Contents
================================================
4 28.051 MiB 0.000 MiB @profile
5 def function():
6 100.523 MiB 72.473 MiB x = dict((e, e) for e in xrange(1000000))
7 183.398 MiB 82.875 MiB y = copy.deepcopy(x)
8 135.395 MiB -48.004 MiB del x
9 135.395 MiB 0.000 MiB return y
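If I understand it correctly, the roughly 48 MiB that comes back here is essentially the dict's own hash table: deepcopy treats ints as atomic and returns them unchanged, so x and y share the integer objects and only x's container goes away. A rough way to check this on CPython (my own sketch):

import copy
import sys

x = dict((e, e) for e in xrange(1000000))
y = copy.deepcopy(x)
print(x is y)                        # False -- distinct dict objects
print(x[999999] is y[999999])        # True  -- deepcopy does not duplicate ints
print(sys.getsizeof(x) / 2.0 ** 20)  # size of the hash table itself in MiB,
                                     # roughly the amount reported freed by 'del x'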
And for a list of lists (compared with the list of integers above, I assume that del x or del x[:] only removes the huge outer list on the heap?):
Line # Mem usage Increment Line Contents
================================================
4 28.043 MiB 0.000 MiB @profile
5 def function():
6 107.691 MiB 79.648 MiB x = [[] for _ in xrange(1000000)]
7 222.312 MiB 114.621 MiB y = copy.deepcopy(x)
8 214.680 MiB -7.633 MiB del x[:]
9 214.680 MiB 0.000 MiB return y
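For what it's worth, after the deepcopy the inner lists in x and y are distinct objects, so del x[:] really does drop the last references to x's million inner lists; whatever keeps the reported usage high, it is not y holding on to them. A quick identity check (my own sketch, scaled down):

import copy

x = [[] for _ in xrange(1000)]  # same structure as above, smaller size
y = copy.deepcopy(x)
print(x[0] is y[0])             # False -- deepcopy duplicated every inner list
print(x == y)                   # True  -- equal contents, but distinct objects
del x[:]                        # x's inner lists lose their last reference here
print(len(y))                   # 1000 -- y still holds its own copies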
So I want to ask: how do I (or is there a way to) free all the underlying lists in x in this example?
Line # Mem usage Increment Line Contents
================================================
4 28.047 MiB 0.000 MiB @profile
5 def function():
6 248.008 MiB 219.961 MiB x = [list(range(10)) for _ in xrange(1000000)]
7 502.195 MiB 254.188 MiB y = copy.deepcopy(x)
8 494.562 MiB -7.633 MiB del x[:]
9 494.562 MiB 0.000 MiB return y
Python frees objects that are no longer referenced anywhere in the program; this process of reclaiming blocks of memory that are no longer used is called garbage collection. To let memory be reclaimed, make sure you do not keep holding references to objects you no longer need, so that they can be garbage-collected. Python's garbage collector keeps the heap clean by removing objects that are no longer needed, and because allocation (reserving a block of memory for the program) and deallocation are handled automatically, you do not have to do manual memory management.
del does not free variables as in C, it simply says that you no longer need it. What then happens is an implementation detail. So what is happening here is that del does not free memory, it simply tells Python that you are done with the variable. Specifically:
7.5. The del statement
del_stmt ::= “del” target_list
Deletion is recursively defined very similar to the way assignment is defined. Rather than spelling it out in full details, here are some hints.
Deletion of a target list recursively deletes each target, from left to right.
Deletion of a name removes the binding of that name from the local or global namespace, depending on whether the name occurs in a global statement in the same code block. If the name is unbound, a NameError exception will be raised.
Deletion of attribute references, subscriptions and slicings is passed to the primary object involved; deletion of a slicing is in general equivalent to assignment of an empty slice of the right type (but even this is determined by the sliced object).
Note that there is no mention of freeing memory. What happens instead is that you tell Python that it can do "whatever it wants" with that memory. In this case your Python implementation (which I assume is CPython) keeps the memory in an internal cache for later use. This allows Python to run faster by not needing to allocate as much memory later.
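In other words, del only removes one reference; whether the object itself (and its memory) goes away depends on whether anything else still refers to it, and even then CPython may hang on to the raw memory. A tiny sketch of the reference-counting side (just an illustration, not part of the profiled code):

import sys

big = list(range(1000000))
alias = big
print(sys.getrefcount(big))   # typically 3: 'big', 'alias', and the temporary
                              # reference held by getrefcount's own argument
del big                       # unbinds the name; the list survives via 'alias'
print(len(alias))             # 1000000
del alias                     # last reference gone -> the object is reclaimed
                              # immediately by reference counting, though CPython
                              # may keep the underlying memory cached for reuse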
Consider this example, where we del x and then create a copy of y again. Note that the amount of memory allocated during the second copy is smaller than during the first. This is because memory is re-used. If we do this again, we see that hardly any memory at all is allocated during the third copy, because Python is simply re-using previously allocated memory:
Line # Mem usage Increment Line Contents
================================================
4 34.777 MiB 0.000 MiB @profile
5 def function():
6 37.504 MiB 2.727 MiB x = [list(range(10)) for _ in xrange(10000)]
7 40.773 MiB 3.270 MiB y = copy.deepcopy(x)
8 40.773 MiB 0.000 MiB del x
9 41.820 MiB 1.047 MiB y2 = copy.deepcopy(y)
10 41.820 MiB 0.000 MiB del y2
11 41.824 MiB 0.004 MiB y3 = copy.deepcopy(y)
12 41.824 MiB 0.000 MiB return y
Excellent "blog": http://www.evanjones.ca/memoryallocator/
http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm
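If you need the memory to actually go back to the operating system (rather than just become reusable inside the process), one common workaround, which is not part of the question or the links above, is to perform the large temporary allocation in a child process; its memory is returned in full when the process exits. A rough sketch with multiprocessing:

import copy
from multiprocessing import Pool

def build_and_summarize(n):
    # all the large allocations live inside the worker process
    x = [list(range(10)) for _ in xrange(n)]
    y = copy.deepcopy(x)
    return len(y)              # hand back only a small result to the parent

if __name__ == '__main__':
    pool = Pool(processes=1)
    result = pool.apply(build_and_summarize, (1000000,))
    pool.close()
    pool.join()                # the worker exits here and the OS reclaims its memory
    print(result)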