I've applied a cgroups rule to a specific user, and I'd like to test whether the memory of programs run by that user is limited as expected. I tried with the following script:
import string
import random

if __name__ == '__main__':
    d = {}
    for i in range(0, 100000000):
        # generate a random string of length 200
        val = ''.join(random.choice(string.ascii_uppercase + string.digits) for _ in range(200))
        d[i] = val
        if i % 10000 == 0:
            print i
When I monitored the process via the ps command, %MEM climbed to 4.8 and then never changed, regardless of whether the cgroups service was on or off:
$ ps aux | grep mem_intensive.py
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
jason 11531 88.5 4.8 3312972 3191236 pts/0 R+ 22:16 0:07 python mem_intensive.py
In this scenario, total memory is 62GB, so 4.8% of it is about 3GB. I set the limit to 4GB, with no other processes running under this user.
So could anyone give me some idea about what's going on with this Python script? Thanks in advance.
The dictionary keys alone illustrate Python's overhead: a million integers can easily fit in 64 bits each, so one would hope Python could store a million of them in no more than ~8MB (a million 8-byte values). In fact, Python uses more like 35MB of RAM to store them. Why? Because Python integers are objects, and objects have a lot of memory overhead.
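You can see the per-object overhead directly with sys.getsizeof; the figures in the comments below are typical for 64-bit CPython 3 and may vary by version and platform:

import sys

# even a 'small' integer carries full object overhead
print(sys.getsizeof(1))          # typically 28 bytes, not 8
# a 200-character ASCII string like the ones built in the script above
print(sys.getsizeof('A' * 200))  # typically 249 bytes: ~49 bytes of header plus 200 of data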
Python doesn't limit your program's memory usage. It will allocate as much memory as your program needs, until the machine runs out. The most you can do is impose a fixed upper cap from within the process, which can be done with the resource module, but that isn't what you're looking for here.
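For completeness, a minimal sketch of that resource-module approach (the 4GB figure just mirrors the cgroup limit from the question; RLIMIT_AS caps the whole virtual address space, and allocations beyond it raise MemoryError inside the process rather than being enforced from outside):

import resource

limit = 4 * 1024 ** 3  # 4GB, matching the cgroup limit in the question

# set the soft and hard limits on this process's virtual address space
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))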
You can also work with the Python memory_profiler: put the @profile decorator on any function or method and run python -m memory_profiler myscript.py. You'll see line-by-line memory usage once your script exits.
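A minimal sketch of what that looks like (the function here is hypothetical, a scaled-down version of the script above; the explicit import is optional when running via -m memory_profiler, which injects @profile for you):

from memory_profiler import profile

@profile
def fill_dict():
    d = {}
    for i in range(100000):
        d[i] = 'A' * 1024
    return d

if __name__ == '__main__':
    fill_dict()

Note that running under the profiler slows the script down considerably, so use a smaller iteration count than in the original.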
I've played a bit with your script, and memory usage does keep growing, albeit slowly. The bottleneck is random.choice: if you want to fill memory fast, generating randomness works against you, and simply using fixed strings exhausts memory much more quickly. If you run the following and want to watch how memory grows, you'd probably throw a time.sleep() after the print:
if __name__ == '__main__':
    d = {}
    for i in range(0, 100000000):
        d[i] = 'A' * 1024  # fixed 1KB string: no randomness to slow things down
        if i % 10000 == 0:
            print(i)
            # time.sleep(0.1)  # uncomment (and import time) to watch memory grow
Filling memory faster, just a one-liner:

['A'*1024 for _ in xrange(0, 1024*1024*1024)]

This builds a list of 2^30 strings of ~1KB each, roughly a terabyte in total, so it will hit any realistic limit almost immediately. (On Python 3, use range instead of xrange.)