
Is it safe to combine 'with' and 'yield' in python?

It's a common idiom in Python to use a context manager to automatically close files:

with open('filename') as my_file:
    # do something with my_file

# my_file gets automatically closed after exiting 'with' block

Now I want to read the contents of several files. The consumer of the data does not know or care whether the data comes from files or non-files. It does not want to check whether the objects it received can be opened or not. It just wants to get something to read lines from. So I create an iterator like this:

def select_files():
    """Yields carefully selected and ready-to-read-from files"""
    file_names = [.......]
    for fname in file_names:
        with open(fname) as my_open_file:
            yield my_open_file

This iterator may be used like this:

for file_obj in select_files():
    for line in file_obj:
        # do something useful

(Note that the same code could be used to consume not open files, but lists of strings - that's cool!)
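To make that point concrete, here is a minimal sketch (the consumer name `count_nonempty` is hypothetical, not from the question) of a consumer that only needs iterables of line-iterables, so it works on lists of strings exactly as it would on open files:

```python
def count_nonempty(sources):
    """Hypothetical consumer: only needs iterables of line-iterables."""
    total = 0
    for source in sources:   # each 'source' could be an open file or a list
        for line in source:  # files and lists both yield lines here
            if line.strip():
                total += 1
    return total

# Feeding it lists of strings instead of open files works unchanged.
print(count_nonempty([["a\n", "\n"], ["b\n"]]))  # 2
```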

The question is: is it safe to yield open files?

At first glance: why not? The consumer calls the iterator, the iterator opens a file and yields it to the consumer. The consumer processes the file and comes back to the iterator for the next one. The iterator code resumes, we exit the 'with' block, the my_open_file object gets closed, we go to the next file, etc.

But what if the consumer never comes back to the iterator for the next file? For example, an exception occurred inside the consumer. Or the consumer found something very exciting in one of the files and happily returned the results to whoever called it?

The iterator code would never resume in this case, we would never reach the end of the 'with' block, and the my_open_file object would never get closed!

Or would it?

asked Jan 26 '17 by lesnik



1 Answer

You bring up a criticism that has been raised before¹. The cleanup in this case is non-deterministic, but it will happen in CPython when the generator gets garbage collected. Your mileage may vary for other Python implementations...

Here's a quick example:

from __future__ import print_function
import contextlib

@contextlib.contextmanager
def manager():
    """Easiest way to get a custom context manager..."""
    try:
        print('Entered')
        yield
    finally:
        print('Closed')


def gen():
    """Just a generator with a context manager inside.

    When the context is entered, we'll see "Entered" on the console
    and when exited, we'll see "Closed" on the console.
    """
    man = manager()
    with man:
        for i in range(10):
            yield i


# Test what happens when we consume a generator.
list(gen())

def fn():
    g = gen()
    next(g)
    # g.close()

# Test what happens when the generator gets garbage collected inside
# a function
print('Start of Function')
fn()
print('End of Function')

# Test what happens when a generator gets garbage collected outside
# a function.  IIRC, this isn't _guaranteed_ to happen in all cases.
g = gen()
next(g)
# g.close()
print('EOF')

Running this script in CPython, I get:

$ python ~/sandbox/cm.py
Entered
Closed
Start of Function
Entered
Closed
End of Function
Entered
EOF
Closed

Basically, what we see is that for generators that are exhausted, the context manager cleans up when you'd expect. For generators that aren't exhausted, the cleanup function runs when the generator is collected by the garbage collector. This happens when the generator goes out of scope (or, IIRC, at the next gc.collect cycle at the latest).
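That claim can be checked with a self-contained variant of the script above (the `events` list is my addition for illustration): in CPython the 'with' block's cleanup runs as soon as the abandoned generator's reference count drops to zero, and the explicit `gc.collect()` is there only to nudge implementations that don't use reference counting.

```python
import contextlib
import gc

events = []

@contextlib.contextmanager
def manager():
    events.append('entered')
    try:
        yield
    finally:
        events.append('closed')

def gen():
    with manager():
        for i in range(10):
            yield i

g = gen()
next(g)        # enter the context manager, stop at the first yield
del g          # abandon the generator without exhausting it
gc.collect()   # redundant on CPython; helps non-refcounting GCs
print(events)  # ['entered', 'closed']
```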

However, doing some quick experiments (e.g. running the above code in pypy), I don't get all of my context managers cleaned up:

$ pypy --version
Python 2.7.10 (f3ad1e1e1d62, Aug 28 2015, 09:36:42)
[PyPy 2.6.1 with GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)]
$ pypy ~/sandbox/cm.py
Entered
Closed
Start of Function
Entered
End of Function
Entered
EOF

So, the assertion that the context manager's __exit__ will get called on all Python implementations is untrue. The misses here are likely attributable to PyPy's garbage-collection strategy (which isn't reference counting): by the time PyPy decides to reap the generators, the process is already shutting down, so it doesn't bother with them... In most real-world applications, the generators would probably get reaped and finalized quickly enough that it doesn't actually matter...


Providing strict guarantees

If you want to guarantee that your context manager is finalized properly, you should take care to close the generator when you are done with it². Uncommenting the g.close() lines above gives me deterministic cleanup because a GeneratorExit is raised at the yield statement (which is inside the context manager) and then it's caught/suppressed by the generator...

$ pypy ~/sandbox/cm.py
Entered
Closed
Start of Function
Entered
Closed
End of Function
Entered
Closed
EOF

$ python3 ~/sandbox/cm.py
Entered
Closed
Start of Function
Entered
Closed
End of Function
Entered
Closed
EOF

$ python ~/sandbox/cm.py
Entered
Closed
Start of Function
Entered
Closed
End of Function
Entered
Closed
EOF
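The same behaviour can be shown in a stripped-down sketch (again using an `events` list instead of print, my own illustrative device): calling close() raises GeneratorExit at the paused yield, so the context manager's finally clause runs immediately and deterministically, no garbage collector required.

```python
import contextlib

events = []

@contextlib.contextmanager
def manager():
    events.append('entered')
    try:
        yield
    finally:
        events.append('closed')

def gen():
    with manager():
        for i in range(10):
            yield i

g = gen()
next(g)        # paused at the first yield, context entered
g.close()      # GeneratorExit raised at the yield; finally runs now
print(events)  # ['entered', 'closed']
```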

FWIW, this means that you can clean up your generators using contextlib.closing:

from contextlib import closing with closing(gen_function()) as items:     for item in items:         pass # Do something useful! 

¹Most recently, some discussion has revolved around PEP 533, which aims to make iterator cleanup more deterministic.
²It is perfectly OK to close an already-closed and/or consumed generator, so you can call it without worrying about the state of the generator.
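A quick sketch of footnote 2: close() is effectively idempotent, so defensive code can always call it without checking the generator's state first.

```python
def gen():
    yield 1
    yield 2

g = gen()
list(g)      # exhaust the generator completely
g.close()    # closing an exhausted generator is a no-op
g.close()    # closing it a second time is also fine -- no exception
print('no error')
```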

answered Oct 06 '22 by mgilson