I'm trying to write a generator function that gets rows out of a database and returns them one at a time. However, I'm not sure if the cleanup code marked ** below executes as I think it does. If it does not, what is the best way to put cleanup code inside a generator itself that executes after the last yield statement? I looked at catching StopIteration but that seems to be done from the caller, not within the generator.
def MYSQLSelectGenerator(stmt):
    ...
    try:
        myDB = MySQLdb.connect(host=..., port=..., user=..., passwd=..., db=...)
        dbc = myDB.cursor()
        dbc.execute(stmt)
        d = "asdf"
        while d is not None:
            d = dbc.fetchone()  # can also use fetchmany() to be more efficient
            yield d
        dbc.close()  # ** DOES THIS WORK AS I INTEND, MEANING AS SOON AS d IS None?
    except MySQLdb.Error as msg:
        print("MYSQL ERROR!")
        print(msg)
Your version will run dbc.close() as soon as d is None, but not if an exception gets raised. You need a finally clause. This version is guaranteed to run dbc.close() even if an exception gets raised:
myDB = MySQLdb.connect(host=..., port=..., user=..., passwd=..., db=...)
dbc = myDB.cursor()
try:
    dbc.execute(stmt)
    d = dbc.fetchone()  # can also use fetchmany() to be more efficient
    while d is not None:
        yield d
        d = dbc.fetchone()
except MySQLdb.Error as msg:
    print("MYSQL ERROR!")
    print(msg)
finally:
    dbc.close()
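The try/finally behavior is the same with any DB-API driver; here is a runnable sketch using the stdlib sqlite3 module (substituted for MySQLdb so the example is self-contained; the function and table names are illustrative) showing that the finally clause fires once the rows are exhausted:

```python
import sqlite3

def select_generator(conn, stmt):
    """Yield rows one at a time; close the cursor no matter how we exit."""
    cur = conn.cursor()
    try:
        cur.execute(stmt)
        row = cur.fetchone()
        while row is not None:
            yield row
            row = cur.fetchone()
    finally:
        # Runs when the loop finishes, when the caller breaks out early,
        # or when the generator is closed/garbage-collected.
        cur.close()
        print("cursor closed")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])

rows = list(select_generator(conn, "SELECT x FROM t ORDER BY x"))
print(rows)  # [(1,), (2,), (3,)]
```

If the caller abandons the generator partway through, calling its close() method (or letting it be garbage-collected) also triggers the finally clause.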
One thing you could do is use a finally clause. Another option (that may be overkill here but is a useful thing to know about) is to make a class that works with the with statement:
class DatabaseConnection:
    def __init__(self, statement):
        self.statement = statement
    def __enter__(self):
        self.myDB = MySQLdb.connect(host=..., port=..., user=..., passwd=..., db=...)
        self.dbc = self.myDB.cursor()
        self.dbc.execute(self.statement)
        return self  # __enter__ must return the object bound by "as"
    def __exit__(self, exc_type, exc_value, traceback):
        self.dbc.close()
    def __iter__(self):
        d = self.dbc.fetchone()
        while d is not None:
            yield d
            d = self.dbc.fetchone()

with DatabaseConnection(stmt) as dbconnection:
    for i in dbconnection:
        print(i)
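The same with-statement pattern can be exercised end-to-end with the stdlib sqlite3 module (again substituted for MySQLdb; the in-memory table and query are illustrative). Note that __enter__ returns self so the "as" binding works, and __exit__ runs even if the for loop raises:

```python
import sqlite3

class DatabaseConnection:
    """Context manager that opens a connection, runs a query,
    and guarantees cleanup when the with-block exits."""
    def __init__(self, statement):
        self.statement = statement
    def __enter__(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE t (x INTEGER)")
        self.conn.executemany("INSERT INTO t VALUES (?)", [(10,), (20,)])
        self.cur = self.conn.cursor()
        self.cur.execute(self.statement)
        return self  # bound by "as dbconnection" below
    def __exit__(self, exc_type, exc_value, traceback):
        # Called on normal exit and on exceptions alike.
        self.cur.close()
        self.conn.close()
    def __iter__(self):
        row = self.cur.fetchone()
        while row is not None:
            yield row
            row = self.cur.fetchone()

with DatabaseConnection("SELECT x FROM t ORDER BY x") as dbconnection:
    results = [row for row in dbconnection]
print(results)  # [(10,), (20,)]
```

This moves the cleanup responsibility out of the generator entirely: the connection's lifetime is tied to the with-block rather than to how far the caller happens to iterate.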