I'm thoroughly confused about the memory usage of a specific python script. I guess I don't really know how to profile the usage despite advice from several SO Questions/Answers.
My questions are: What's the difference between memory_profiler and guppy.hpy? Why is one telling me I'm using huge amounts of memory, and the other telling me I'm not?
I'm working with pysam, a library for accessing bioinformatics SAM/BAM files. My main script is quickly running out of memory when converting SAM (ASCII) to BAM (binary) and manipulating the files in between.
I created a small test example to understand how much memory gets allocated at each step.
# test_pysam.py:
import pysam
#from guppy import hpy

TESTFILENAME = ('/projectnb/scv/yannpaul/MAR_CEJ082/' +
                'test.sam')

#H = hpy()

@profile  # for memory_profiler
def samopen(filename):
    # H.setrelheap()
    samf = pysam.Samfile(filename)
    # print H.heap()
    pass

if __name__ == "__main__":
    samopen(TESTFILENAME)
Monitoring the memory usage with memory_profiler (python -m memory_profiler test_pysam.py) results in the following output:
Filename: test_pysam.py

Line #    Mem usage    Increment   Line Contents
================================================
    10                             @profile # for memory_profiler
    11                             def samopen(filename):
    12     10.48 MB      0.00 MB       # print H.setrelheap()
    13    539.51 MB    529.03 MB       samf = pysam.Samfile(filename)
    14                                 # print H.heap()
    15    539.51 MB      0.00 MB       pass
Then, commenting out the @profile decorator and uncommenting the guppy-related lines, I get the following output (python test_pysam.py):
Partition of a set of 3 objects. Total size = 624 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0      1  33      448  72       448  72 types.FrameType
     1      1  33       88  14       536  86 __builtin__.weakref
     2      1  33       88  14       624 100 csamtools.Samfile
The total size reported for line 13 is 529.03 MB in one case and 624 bytes in the other. What's actually going on here? 'test.sam' is a ~52MB SAM (again, an ASCII format) file. It's a bit tricky for me to dig deep into pysam, as it's a wrapper around a C library related to samtools. Regardless of what a Samfile actually is, I think I should be able to learn how much memory is allocated to create it. What procedure should I use to correctly profile the memory usage of each step of my larger, more complex python program?
What's the difference between memory_profiler and guppy.hpy?
Do you understand the difference between your internal view of the heap and the OS's external view of your program? (For example, when the Python interpreter calls free on 1MB, that doesn't immediately, or maybe ever, return 1MB worth of pages to the OS, for multiple reasons.) If you do, then the answer is pretty easy: memory_profiler asks the OS for your memory use; guppy figures it out internally from the heap structures.
Beyond that, memory_profiler has one feature guppy doesn't: automatically instrumenting your function to print a report after each line of code. It is otherwise much simpler and easier, but less flexible. If there's something you know you want to do and memory_profiler doesn't seem to do it, it probably can't; with guppy, maybe it can, so study the docs and the source.
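To make the two viewpoints concrete, here's a minimal sketch (my own illustration, not from the original question) that reads the same process from both sides: psutil gives the OS-level RSS figure that memory_profiler reports (memory_profiler uses psutil under the hood when it's available), while guppy's hpy() walks the live Python objects. It assumes psutil and guppy are installed; everything else is standard library.

import os
import psutil            # OS view: resident set size, as memory_profiler reports it
from guppy import hpy    # internal view: live Python objects on the heap

proc = psutil.Process(os.getpid())
h = hpy()
h.setrelheap()                        # measure heap growth relative to this point
rss_before = proc.memory_info().rss

data = [b"x" * 1024 for _ in range(10000)]   # keep ~10 MB of Python objects alive

rss_delta = proc.memory_info().rss - rss_before
print("OS view (RSS delta): %d bytes" % rss_delta)
print("Heap view (guppy):")
print(h.heap())                       # counts only the Python objects it can traverse

Run it a few times and the two numbers will rarely agree, because the OS figure includes allocator overhead, fragmentation, and anything malloc'd by C extensions, while guppy only sees Python objects it knows how to walk.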
Why is one telling me I'm using huge amounts of memory, and the other is telling me I'm not?
It's hard to be sure, but here are some guesses; the answer is likely to be a combination of more than one:
Maybe samtools uses mmap to map small enough files entirely into memory. This would increase your page usage by the size of the file, but not increase your heap usage at all. (A small sketch of this effect follows after these guesses.)
Maybe samtools or pysam creates a lot of temporary objects that are quickly freed. You could have lots of fragmentation (only a couple live PyObjects on each page), or your system's malloc may have decided it should keep lots of nodes in its freelist because of the way you've been allocating, or it may not have returned pages to the OS yet, or the OS's VM may not have reclaimed pages that were returned. The exact reason is almost always impossible to guess; the simplest thing to do is to assume that freed memory is never returned.
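Here's the promised sketch of the mmap guess (my own illustration, not necessarily what samtools does): mapping a large file and touching its pages raises the process's RSS, while the Python heap barely moves, because no comparably large Python object is ever created. It assumes psutil is installed and that 'test.sam' is any large local file.

import mmap
import os
import psutil

proc = psutil.Process(os.getpid())

with open('test.sam', 'rb') as f:                 # any large file will do
    rss_before = proc.memory_info().rss
    m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    for offset in range(0, m.size(), 4096):       # touch every page without copying data
        m[offset]
    rss_delta = proc.memory_info().rss - rss_before
    print("RSS grew by ~%d MB, with no comparably large object on the Python heap"
          % (rss_delta // 2 ** 20))
    m.close()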
What procedure should I use to correctly profile the memory usage of each step of my larger, more complex python program?
If you're asking about memory usage from the OS point of view, memory_profiler is doing exactly what you want. While major digging into pysam may be difficult, it should be trivial to wrap a few of the functions with the @profile decorator. Then you'll know which C functions are responsible for memory; if you want to dig any deeper, you obviously have to profile at the C level (unless there's information in the samtools docs or from the samtools community).
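As a sketch of that procedure (the function names and the conversion loop are placeholders for whatever your real pipeline does), you can import the decorator directly instead of relying on python -m memory_profiler:

from memory_profiler import profile   # explicit import; no need for python -m memory_profiler
import pysam

@profile
def load_sam(path):
    return pysam.Samfile(path)         # same call as in the question

@profile
def convert_to_bam(samf, out_path):
    # hypothetical conversion step; substitute your real manipulation code here
    out = pysam.Samfile(out_path, 'wb', template=samf)
    for read in samf:
        out.write(read)
    out.close()

if __name__ == '__main__':
    samf = load_sam('test.sam')
    convert_to_bam(samf, 'test.bam')

Each decorated function then gets its own line-by-line report, so the large increments point you at the specific pysam (and hence C-level) calls worth investigating further.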