
Python memory consumption of objects and process

I wrote the following code:

from hurry.filesize import size
from pysize import get_size
import os
import psutil

def load_objects():
    process = psutil.Process(os.getpid())
    print "start method"
    print "process consumes " + size(process.memory_info().rss)
    objects = make_a_call()  # make_a_call() builds the objects (definition not shown)
    print "total size of objects is " + size(get_size(objects))
    print "process consumes " + size(process.memory_info().rss)
    print "exit method"

def main():
    process = psutil.Process(os.getpid())
    print "process consumes " + size(process.memory_info().rss)
    load_objects()
    print "process consumes " + size(process.memory_info().rss)

get_size() returns the deep memory consumption of the objects; it comes from pysize, whose source is quoted in the first answer below.

I get the following prints:

process consumes 21M
start method
total size of objects is 20M
process consumes 29M
exit method
process consumes 29M
  1. How come the objects consumed 20M if the process consumed only 8M more?
  2. If I exit a method, shouldn't the memory decrease back to 21M, since the garbage collector should reclaim the consumed memory?
asked Dec 20 '17 by Dejell



2 Answers

  1. Most likely this is because of an inaccuracy in your measurement code.

Here's a fully working (Python 2.7) example that exhibits the same problem (I've slightly updated the original code for simplicity's sake):

from hurry.filesize import size
from pysize import get_size
import os
import psutil


def make_a_call():
    return range(1000000)

def load_objects():
    process = psutil.Process(os.getpid())
    print "start method"
    print "process consumes ", size(process.memory_info().rss)
    objects = make_a_call()
    # BUG: get_size() walks every object *before* the final RSS reading,
    # so its own allocations inflate the number printed next
    print "total size of objects is ", size(get_size(objects))
    print "process consumes ", size(process.memory_info().rss)
    print "exit method"

def main():
    process = psutil.Process(os.getpid())
    print "process consumes " + size(process.memory_info().rss)
    load_objects()
    print "process consumes " + size(process.memory_info().rss)


main()

Here's the output:

process consumes 7M
start method
process consumes  7M
total size of objects is  30M
process consumes  124M
exit method
process consumes 124M

The process grows by ~117MB even though the objects themselves are reported as only 30MB.

And here's the fixed version of the code:

from hurry.filesize import size
from pysize import get_size
import os
import psutil


def make_a_call():
    return range(1000000)

def load_objects():
    process = psutil.Process(os.getpid())
    print "start method"
    print "process consumes ", size(process.memory_info().rss)
    objects = make_a_call()
    # read RSS *before* walking the objects with get_size()
    print "process consumes ", size(process.memory_info().rss)
    print "total size of objects is ", size(get_size(objects))
    print "exit method"

def main():
    process = psutil.Process(os.getpid())
    print "process consumes " + size(process.memory_info().rss)
    load_objects()
    print "process consumes " + size(process.memory_info().rss)


main()

And here is the updated output:

process consumes 7M
start method
process consumes  7M
process consumes  38M
total size of objects is  30M
exit method
process consumes 124M

Did you spot the difference? You're calculating object sizes before measuring the final process size, and that calculation itself causes additional memory consumption. Let's check why that might be happening; here's the source of https://github.com/bosswissam/pysize/blob/master/pysize.py:

import sys
import inspect

def get_size(obj, seen=None):
    """Recursively finds size of objects in bytes"""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    # Important mark as seen *before* entering recursion to gracefully handle
    # self-referential objects
    seen.add(obj_id)
    if hasattr(obj, '__dict__'):
        for cls in obj.__class__.__mro__:
            if '__dict__' in cls.__dict__:
                d = cls.__dict__['__dict__']
                if inspect.isgetsetdescriptor(d) or inspect.ismemberdescriptor(d):
                    size += get_size(obj.__dict__, seen)
                break
    if isinstance(obj, dict):
        size += sum((get_size(v, seen) for v in obj.values()))
        size += sum((get_size(k, seen) for k in obj.keys()))
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum((get_size(i, seen) for i in obj))
    return size

Lots of things are happening here! The most notable one is that it holds every object it has seen in a set, in order to resolve circular references. If you remove that bookkeeping, it won't eat that much memory in either case. To see the overhead directly, compare RSS immediately before and after a get_size() call, as in the sketch below.
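
Here's a minimal sketch of that comparison (assuming hurry.filesize and pysize are installed, as in the examples above):

import os
import psutil
from hurry.filesize import size
from pysize import get_size

process = psutil.Process(os.getpid())
objects = range(1000000)

print "RSS before get_size:", size(process.memory_info().rss)
# get_size() visits ~1M objects and stores each id() in its 'seen' set
get_size(objects)
print "RSS after get_size: ", size(process.memory_info().rss)

The second reading should come out noticeably higher: a set holding a million entries, plus the integer objects produced by id(), is itself tens of megabytes.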

  2. First of all, this behavior heavily depends on whether you use CPython or something else. As for CPython, this may happen because it's not always possible to give memory back to the OS immediately.

Here's a good article on the subject, quoting:

If you create a large object and delete it again, Python has probably released the memory, but the memory allocators involved don’t necessarily return the memory to the operating system, so it may look as if the Python process uses a lot more virtual memory than it actually uses.
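
A quick way to observe this behavior (a sketch only; exact numbers vary by platform and CPython build):

import gc
import os
import psutil
from hurry.filesize import size

process = psutil.Process(os.getpid())
print "before allocation:", size(process.memory_info().rss)

data = range(5000000)  # a list of 5M ints: well over 100MB on CPython 2
print "after allocation: ", size(process.memory_info().rss)

del data
gc.collect()
# the objects are freed internally, but CPython's allocators (and the
# int free list in 2.x) may keep the pages, so RSS can stay high
print "after deletion:   ", size(process.memory_info().rss)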

answered by ffeast

  1. Why would the process need to grow by more than 8M? The allocator can satisfy new objects from memory the process has already reserved, so RSS does not have to grow by the full deep size of the new objects.
  2. Garbage collection does not necessarily happen immediately. See the documentation:

Objects are never explicitly destroyed; however, when they become unreachable they may be garbage-collected. An implementation is allowed to postpone garbage collection or omit it altogether — it is a matter of implementation quality how garbage collection is implemented, as long as no objects are collected that are still reachable.

CPython implementation detail: CPython currently uses a reference-counting scheme with (optional) delayed detection of cyclically linked garbage, which collects most objects as soon as they become unreachable, but is not guaranteed to collect garbage containing circular references. See the documentation of the gc module for information on controlling the collection of cyclic garbage. Other implementations act differently and CPython may change. Do not depend on immediate finalization of objects when they become unreachable (so you should always close files explicitly).
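
As a minimal sketch of the cyclic case the docs mention: reference counting alone cannot reclaim a cycle, so the objects linger until the cycle collector runs.

import gc

class Node(object):
    pass

a = Node()
b = Node()
a.partner = b  # build a reference cycle
b.partner = a

del a, b            # unreachable now, but the refcounts never hit zero
print gc.collect()  # the cycle detector reclaims them; prints a count > 0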

answered by ubadub