Context: I'm developing for AWS Lambda, which imposes time and memory limitations (source). I want to check in my unit tests whether I might break those limits.
I have seen pytest-timeout for limiting the running time of a test in this question, and I will use it for the time restriction.
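For reference, this is roughly how I would use it (a minimal sketch, assuming pytest-timeout is installed; the 60-second budget and the foo import are placeholders):

import pytest

from foo import foo  # placeholder function, as in the examples below

@pytest.mark.timeout(60)  # fail the test if it runs longer than 60 seconds
def test_foo_runs_in_time():
    foo()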
Is there something similar for memory?
Something like:

@pytest.mark.max_memory_kb(128000)
def test_foo():
    pass
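As far as I know, such a marker does not exist out of the box. A rough sketch of how one could be wired up with an autouse fixture and the stdlib tracemalloc module (the marker name and fixture are my own invention; note that tracemalloc only tracks Python-level allocations, not the whole-process RSS that Lambda actually limits):

# conftest.py
import tracemalloc

import pytest

def pytest_configure(config):
    config.addinivalue_line(
        "markers",
        "max_memory_kb(limit): fail if peak traced memory exceeds limit")

@pytest.fixture(autouse=True)
def enforce_max_memory(request):
    marker = request.node.get_closest_marker("max_memory_kb")
    if marker is None:
        yield
        return
    limit_kb = marker.args[0]
    tracemalloc.start()
    yield  # run the test
    _current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    assert peak <= limit_kb * 1024, (
        f"peak traced memory {peak / 1024:.0f} KiB exceeds {limit_kb} KiB")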
These are the work-arounds I found.
from unittest import TestCase
import sys

from foo import foo  # the function under test

class AgentAPITests(TestCase):
    def test_foo(self):
        return_value = foo()
        size = sys.getsizeof(return_value)
        max_bytes = 1337
        self.assertLess(size, max_bytes)
Drawback: It only checks the size of the return value, so it does not catch what is happening in between, while foo is running. For this reason I will not accept this answer.
This is not a unit test, so I will not accept this either.
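A sketch that does catch what happens in between is to read the process's peak RSS with the stdlib resource module (Unix only; ru_maxrss is a whole-process high-water mark that includes interpreter overhead and never decreases between tests, so it is only a coarse guard; the 128000 kB limit is the example figure from above):

import resource
import sys
from unittest import TestCase

from foo import foo  # placeholder, as above

class AgentAPIMemoryTests(TestCase):
    def test_foo_peak_rss(self):
        foo()
        peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        if sys.platform == "darwin":
            peak_kb //= 1024  # macOS reports bytes, Linux kilobytes
        self.assertLess(peak_kb, 128000)  # 128 MB, as in the marker example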
I found something similar here using memory_profiler:
#!/usr/bin/env python

# core modules
from memory_profiler import profile

# internal modules
import foo

precision = 10
fp = open('memory_profiler_basic_mean.log', 'w+')

@profile(precision=precision, stream=fp)
def test():
    return_val = foo.bar()
    print(return_val)

test()
which creates a log file like this:
Filename: foobar.py

Line #    Mem usage            Increment            Line Contents
================================================
    14    51.6640625000 MiB    51.6640625000 MiB    @profile(precision=precision, stream=fp)
    15                                              def test():
    16    52.2968750000 MiB     0.6328125000 MiB        return_val = foo.bar()
    17    52.2968750000 MiB     0.0000000000 MiB        print(return_val)
Hence I can see that the process sits at about 51.7 MiB when the test starts and that foo.bar() adds roughly 0.6 MiB. If I could use those numbers in a unit test, my problem would be solved.
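memory_profiler can also be called programmatically, which gets close to a unit test. A sketch (the sampling interval and the 128 MiB threshold are my assumptions; note the samples are whole-process RSS in MiB, so the ~52 MiB interpreter baseline from the log above is included):

from memory_profiler import memory_usage

import foo  # placeholder module, as above

def test_bar_stays_under_memory_limit():
    # Sample the process's memory (in MiB) while foo.bar() runs.
    samples = memory_usage((foo.bar, (), {}), interval=0.05)
    assert max(samples) < 128, f"peak memory was {max(samples):.1f} MiB"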
There are related questions going in this manual, non-unit-test direction; guppy seems to be the tool usually used for that.