Suppose we have a class with a method that fails from time to time, but works perfectly after some corrective action.
A real-life example would be a MySQL query that raises _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away')
but works fine after the client reconnects.
I've tried to write a decorator for this:
def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''
    def inner(self, *args, _retry=True, **kwargs):
        try:
            return func(self, *args, **kwargs)
        except Mysql.My.OperationalError as e:
            # No retry? Rethrow
            if not _retry:
                raise
            # Handle server connection errors only
            # http://dev.mysql.com/doc/refman/5.0/en/error-messages-client.html
            if (e.code < 2000) or (e.code > 2055):
                raise
            # Reconnect
            self.connection.reconnect()
            # Retry
            return inner(self, *args, _retry=False, **kwargs)
    return inner
class A(object):
    ...
    @_auto_reconnect_wrapper
    def get_data(self):
        sql = '...'
        return self.connection.fetch_rows(sql)
And if the client loses the connection, it just silently reconnects and everybody is happy.
But what if I want to transform get_data() into a generator (and use the yield statement)?
@_auto_reconnect_wrapper
def get_data(self):
    sql = '...'
    cursor = self.connection.execute(sql)
    for row in cursor:
        yield row
    cursor.close()
Well, the previous example won't work, because the inner function has already returned a generator, and it will break after the first call to next().
As I understand it, when Python sees yield inside a function, calling that function just returns a generator immediately (without executing a single statement of the body) and waits for the first next().
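That deferred-execution behavior is easy to verify with a tiny standalone generator (this is an illustration, not part of the original code):

```python
def lazy():
    print("body started")  # does not run when lazy() is called
    yield 42

g = lazy()       # returns a generator object; no output yet
value = next(g)  # only now does the body run up to the first yield
```

Calling `lazy()` prints nothing; the `print` fires on the first `next(g)`, which is exactly why the exception-catching `try` in the decorator never sees errors raised inside a decorated generator's body.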
I've managed to make it work by replacing:

return func(self, *args, **kwargs)

with:

for row in func(self, *args, **kwargs):
    yield row
But I'm curious whether there is a more elegant (more Pythonic) way to do this. Is there a way to make Python run all the code up to the first yield and then wait?
I'm aware of the possibility of just calling return tuple(func(self, *args, **kwargs)), but I want to avoid loading all records at once.
A decorator is a callable used to modify a function (or class): a reference to the function (or class) is passed to the decorator, and the decorator returns a modified function (or class), which usually wraps a call to the original.
An iterator is any object that implements the iterator protocol: iter() returns it, and next() retrieves the following value of its sequence (raising StopIteration when exhausted). A generator is a function that produces such a sequence lazily using the yield keyword; calling it returns a generator object, which is itself an iterator.
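The relationship between the two can be shown with a hand-written iterator class and the equivalent generator function (an illustrative sketch, not code from the question):

```python
# An iterator: any object implementing __iter__ and __next__.
class CountDown:
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self

    def __next__(self):
        if self.current <= 0:
            raise StopIteration
        value = self.current
        self.current -= 1
        return value

# A generator function producing the same sequence with yield.
def count_down(start):
    while start > 0:
        yield start
        start -= 1
```

Both `list(CountDown(3))` and `list(count_down(3))` produce `[3, 2, 1]`; the generator version simply lets Python build the iterator object for you.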
First, I think the solution you're currently using is fine. When you decorate a generator, the decorator is going to need to at least behave like an iterator over that generator, and doing that by making the decorator a generator too is perfectly OK. As x3al pointed out, using yield from func(...) instead of for row in func(...): yield row is a possible optimization.
If you want to avoid making the decorator itself a generator, you can do that by calling next, which will run the wrapped generator until its first yield and return the first yielded value. You'll then need the decorator to capture that first value and return it along with the rest of the values yielded by the generator. You can do that with itertools.chain:
import itertools

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''
    def inner(self, *args, _retry=True, **kwargs):
        gen = func(self, *args, **kwargs)
        try:
            value = next(gen)
            return itertools.chain([value], gen)
        except StopIteration:
            return gen
        except Mysql.My.OperationalError as e:
            ...
            # Retry
            return inner(self, *args, _retry=False, **kwargs)
    return inner
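The next() plus itertools.chain trick can be seen in isolation with a throwaway generator (an illustration; `numbers` is a made-up stand-in for the decorated method):

```python
import itertools

def numbers():
    yield 1
    yield 2
    yield 3

gen = numbers()
first = next(gen)                     # runs the body up to the first yield
rest = itertools.chain([first], gen)  # re-attach the value we consumed
```

Iterating `rest` now produces `1, 2, 3` as if nothing had been consumed, yet the body has already executed up to the first yield, so connection errors raised there can be caught eagerly.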
You could also make the decorator work with both generator and non-generator functions, using inspect to determine whether you're decorating a generator:
import inspect
import itertools

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''
    def inner(self, *args, _retry=True, **kwargs):
        try:
            gen = func(self, *args, **kwargs)
            if inspect.isgenerator(gen):
                value = next(gen)
                return itertools.chain([value], gen)
            else:  # Normal function
                return gen
        except StopIteration:
            return gen
        except Mysql.My.OperationalError as e:
            ...
            # Retry
            return inner(self, *args, _retry=False, **kwargs)
    return inner
I would favor the yield/yield from-based solution, unless you have a requirement to decorate regular functions in addition to generators.