How do I test the STDOUT output of a Python script with a testing framework like doctest, unittest, nose, etc.? For example, running my script as "todo.py --list" should print "take out the garbage".
I've read about people who separate the STDOUT-printing part of a script from the part that generates the output to be printed. I'm used to sprinkling print statements all over my shell scripts. Is this simply a TDD-unfriendly habit I should break, or is there a way to easily test for correct print output?
Python's own test suite does this quite a bit, and we use two main techniques:
Redirecting stdout (as others have suggested). We use a context manager for this:
import io
import sys
import contextlib

@contextlib.contextmanager
def captured_output(stream_name):
    """Run the 'with' statement body using a StringIO object in place of a
    specific attribute on the sys module.

    Example use (with 'stream_name=stdout'):

        with captured_stdout() as s:
            print("hello")
        assert s.getvalue() == "hello\n"
    """
    orig_stdout = getattr(sys, stream_name)
    setattr(sys, stream_name, io.StringIO())
    try:
        yield getattr(sys, stream_name)
    finally:
        setattr(sys, stream_name, orig_stdout)

def captured_stdout():
    return captured_output("stdout")

def captured_stderr():
    return captured_output("stderr")

def captured_stdin():
    return captured_output("stdin")
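For instance, a test built on top of captured_stdout could look like the sketch below. Here todo and todo.main(argv) are hypothetical stand-ins for the asker's script; the trailing newline in the expectation comes from print().

import unittest

import todo  # hypothetical module holding the script's logic

class TestTodoOutput(unittest.TestCase):
    def test_list_prints_item(self):
        # Run the code under test while sys.stdout is a StringIO.
        with captured_stdout() as stdout:
            todo.main(["--list"])  # hypothetical main(argv) entry point
        # print() appends a newline, so expect it here too.
        self.assertEqual(stdout.getvalue(), "take out the garbage\n")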
Using the subprocess module. We use this when we specifically want to test handling of command line arguments. See http://hg.python.org/cpython/file/default/Lib/test/test_cmd_line_script.py for several examples.
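For the asker's example, a rough subprocess-based sketch (assuming todo.py sits in the current working directory and prints a single line) might be:

import subprocess
import sys
import unittest

class TestCommandLine(unittest.TestCase):
    def test_list_option(self):
        # Run the script in a fresh interpreter and capture everything it prints.
        result = subprocess.run(
            [sys.executable, "todo.py", "--list"],
            capture_output=True, text=True, check=True,
        )
        self.assertEqual(result.stdout.strip(), "take out the garbage")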
I see two ways:
Redirect stdout during the unittest:
import sys
from io import StringIO
from unittest import TestCase

import yourscriptmodule

class YourTest(TestCase):
    def setUp(self):
        # Swap sys.stdout for an in-memory buffer before each test.
        self.output = StringIO()
        self.saved_stdout = sys.stdout
        sys.stdout = self.output

    def tearDown(self):
        # Restore the real stdout even if the test failed.
        self.output.close()
        sys.stdout = self.saved_stdout

    def testYourScript(self):
        yourscriptmodule.main()
        assert self.output.getvalue() == "My expected output"
Use a logger for your outputs and listen to it in your test.
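If you go that route, unittest.TestCase.assertLogs (Python 3.4+) captures records from a named logger, so the test never touches stdout at all. A minimal sketch, assuming the script writes through a logger called "todo":

import logging
import unittest

logger = logging.getLogger("todo")

def list_items():
    # The script emits its user-facing output through the logger instead of print().
    logger.info("take out the garbage")

class TestTodoLogging(unittest.TestCase):
    def test_list_items_logs_expected_message(self):
        with self.assertLogs("todo", level="INFO") as captured:
            list_items()
        # captured.output holds the formatted records as "LEVEL:logger:message".
        self.assertEqual(captured.output, ["INFO:todo:take out the garbage"])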