I want to unit-test a component of my application. The code looks a little like this:

```python
def read_content_generator(myfile):
    for line in open(myfile):
        # do some string manipulation.
        yield result
```
The problem I am having is that I cannot mock the `open()` call inside the `for` loop. What I am aiming for is a unittest like this (I know this code is not right, but it's just an example of what I am trying to do):
```python
def test_openiteration(self):
    with mock.patch('open') as my_openmock:
        my_openmock.return_value = ['1', '2', '3']
        response = myfunction()
        self.assertEquals([1, 2, 3], response)
```
`unittest.mock` (new in Python 3.3) is a library for testing in Python. It allows you to replace parts of your system under test with mock objects and make assertions about how they have been used. It provides a core `Mock` class, removing the need to create a host of stubs throughout your test suite.
`mock_open` is used to return test data. The patch target for the built-in `open` depends on the Python version: in Python 2 it is `__builtin__.open`, whereas in Python 3 it is `builtins.open`.
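A minimal sketch of patching the built-in on Python 3 (the file path is arbitrary, since no real file is ever opened):

```python
import unittest.mock as mock

# Patch the built-in open for Python 3; in Python 2 the target
# would be '__builtin__.open' instead.
with mock.patch('builtins.open', mock.mock_open(read_data='1\n2\n3\n')):
    with open('any/path.txt') as f:  # no real file is touched
        data = f.read()

print(data)  # read() returns the read_data passed to mock_open
```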
You can mock `open()` to return a `StringIO` object.
mymodule.py:

```python
def read_content_generator(myfile):
    with open(myfile) as f:
        for line in f:
            yield '<{}>'.format(line)
```
Note that I've used a `with` statement there.
test_mymodule.py:

```python
import io
import unittest
import unittest.mock as mock

import mymodule


class Tests(unittest.TestCase):
    def test_gen(self):
        fake_file = io.StringIO('foo\nbar\n')
        with mock.patch('mymodule.open', return_value=fake_file, create=True):
            result = list(mymodule.read_content_generator('filename'))
            self.assertEqual(result, ['<foo\n>', '<bar\n>'])
```
This works on Python 3.4. At first I tried `mock.mock_open(read_data='1\n2\n3\n')`, but iterating over the resulting file handle is not supported on that version (`__iter__`/`__next__` support was only added to `mock_open` in Python 3.8).
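For what it's worth, on Python 3.8 and later the simpler `mock_open` approach does work, since its file handles gained iteration support. A hedged sketch:

```python
import unittest.mock as mock

# On Python >= 3.8, mock_open handles support __iter__/__next__,
# so 'for line in f' iterates over the lines of read_data.
with mock.patch('builtins.open', mock.mock_open(read_data='1\n2\n3\n')):
    with open('ignored.txt') as f:
        lines = [line for line in f]

print(lines)  # each line keeps its trailing newline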
There are two easy options available:

1. Change `read_content_generator` to take a file object instead of a file name, and pass it an `io.StringIO` in your test.
2. Make a temporary file; the standard library's `tempfile` module is good for this.
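A sketch of both options (the generator body is copied from mymodule.py above; the temporary-file handling is one possible pattern, not the only one):

```python
import io
import os
import tempfile

# Option 1: the generator takes an already-open file object,
# so a test can pass an io.StringIO instead of patching open().
def read_content_generator(f):
    for line in f:
        yield '<{}>'.format(line)

result1 = list(read_content_generator(io.StringIO('foo\nbar\n')))

# Option 2: write the test data to a real temporary file.
fd, path = tempfile.mkstemp()
try:
    with os.fdopen(fd, 'w') as tmp:
        tmp.write('foo\nbar\n')
    with open(path) as f:
        result2 = list(read_content_generator(f))
finally:
    os.unlink(path)

print(result1, result2)
```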