 

Python - Log memory usage

Tags: python, slurm

Is there a way in Python 3 to log the memory (RAM) usage while some program is running?

Some background info: I run simulations on an HPC cluster using Slurm, where I have to reserve some memory before submitting a job. I know that my job requires a lot of memory, but I am not sure how much. So I was wondering if there is a simple way to log the memory usage over time.

asked Nov 21 '17 by physicsGuy



2 Answers

You can do that with the memory_profiler package. Just by adding the @profile decorator to a function, you will get output like this:

Line #    Mem usage  Increment   Line Contents
==============================================
 3                           @profile
 4      5.97 MB    0.00 MB   def my_func():
 5     13.61 MB    7.64 MB       a = [1] * (10 ** 6)
 6    166.20 MB  152.59 MB       b = [2] * (2 * 10 ** 7)
 7     13.61 MB -152.59 MB       del b
 8     13.61 MB    0.00 MB       return a
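For reference, the function from that sample output could look like the sketch below; the try/except fallback is my addition so the script still runs even where memory_profiler is not installed:

```python
try:
    from memory_profiler import profile
except ImportError:
    # fallback: no-op decorator so the sketch runs without memory_profiler
    def profile(func):
        return func

@profile
def my_func():
    a = [1] * (10 ** 6)       # allocate a ~8 MB list
    b = [2] * (2 * 10 ** 7)   # allocate a ~160 MB list
    del b                     # release it again
    return a

if __name__ == '__main__':
    my_func()
```

With memory_profiler installed, running `python -m memory_profiler myscript.py` prints the line-by-line table shown above.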

Otherwise, the easiest way is to ask Slurm afterwards with the sacct -l -j <JobId> command (look at the MaxRSS column) so that you can adapt the reservation for further jobs.
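If you want to query that from Python, a small wrapper around sacct might look like this (a sketch: the `job_max_rss` helper and its field choices are mine, not from the answer, and it naturally only works on a machine where sacct is available):

```python
import subprocess

def sacct_cmd(job_id):
    """Build the sacct command that reports MaxRSS for one job."""
    return ['sacct', '-j', str(job_id), '--format=MaxRSS', '--noheader', '-P']

def job_max_rss(job_id):
    """Return the MaxRSS string (e.g. '1234K') Slurm recorded for a finished job."""
    out = subprocess.check_output(sacct_cmd(job_id), text=True)
    # keep the first non-empty line (the job step's MaxRSS)
    return next(line.strip() for line in out.splitlines() if line.strip())
```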

Also, you can use the top command while the program is running to get an idea of its memory consumption; look at the RES column.
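For logging over time from inside the job itself, the standard library's resource module (Unix only) can record the peak resident set size without any extra dependencies. A minimal sketch; note that on Linux ru_maxrss is reported in KiB:

```python
import resource
import time

def log_peak_rss(n=3, interval=1.0):
    """Sample this process's peak RSS n times, interval seconds apart."""
    samples = []
    for _ in range(n):
        peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        samples.append(peak_kib)
        time.sleep(interval)
    return samples

if __name__ == '__main__':
    for kib in log_peak_rss():
        print(f'peak RSS: {kib} KiB')
```

Since ru_maxrss is a high-water mark, the samples never decrease; printed periodically, it tells you the largest reservation the job would have needed so far.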

answered Oct 18 '22 by damienfrancois


You can use the subprocess module. Here is sample output of the bash command free:

$ free -m
             total       used       free     shared    buffers     cached
Mem:          7979       7678        300          0        109       4628
-/+ buffers/cache:       2941       5038
Swap:         2046        360       1686

Python program (note that in Python 3, check_output returns bytes unless you pass text=True):

import subprocess

result = subprocess.check_output(['free', '-m'], text=True)
free_memory = result.split('\n')[1].split()[3]
print(free_memory)  # e.g. 300

If you want to check the memory usage of some process or log it periodically, you can use pmap or another utility, depending on your use case, and then parse the output.
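For periodic logging, the same free call can be put in a loop. A rough sketch (the `log_free` helper and its defaults are mine, not from the answer):

```python
import subprocess
import time

def log_free(logfile='mem.log', interval=60, iterations=5):
    """Append timestamped `free -m` output to a log file."""
    with open(logfile, 'a') as f:
        for _ in range(iterations):
            out = subprocess.check_output(['free', '-m'], text=True)
            f.write(time.strftime('%H:%M:%S') + '\n' + out + '\n')
            f.flush()  # make sure partial logs survive a killed job
            time.sleep(interval)
```

Started in the background alongside the simulation, this gives a simple system-wide memory trace you can inspect after the job finishes.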

answered Oct 18 '22 by Jay