I'm currently developing a Python script that processes data pulled from a MySQL database. To access this data, I'm using the MySQLdb module.
This module follows the guidelines laid out in PEP 249, the Python DB API: you create a connection object and then a cursor object, which is used to iterate through the results.
Currently, in my project, I create a connection object any time I need to do a block of MySQL reads/writes, then close it when I'm done. However, I could easily pass the connection object around to avoid these repeated opens and closes.
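To illustrate, each block currently looks roughly like this (the connection parameters and query are placeholders):

    import MySQLdb

    def fetch_totals():
        # Open a connection just for this block of work, then close it.
        conn = MySQLdb.connect(host="localhost", user="user",
                               passwd="secret", db="mydb")  # placeholder credentials
        try:
            cur = conn.cursor()
            cur.execute("SELECT id, total FROM orders")  # hypothetical query
            return cur.fetchall()
        finally:
            conn.close()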
My question is: taking into account security, resource management, etc., is the "open; read/write; close; repeat for the next read/write" methodology better than the "open; read/write; pass connection on for the next read/write" approach?
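The second approach would instead look something like this, with the caller owning the connection (the helper and table names are again just placeholders):

    def read_totals(conn):
        # Borrows a connection owned by the caller instead of opening its own.
        cur = conn.cursor()
        cur.execute("SELECT id, total FROM orders")  # hypothetical query
        return cur.fetchall()

    conn = MySQLdb.connect(host="localhost", user="user",
                           passwd="secret", db="mydb")  # placeholder credentials
    try:
        totals = read_totals(conn)
        # ...further read/write blocks reuse the same conn...
    finally:
        conn.close()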
EDIT: Some more context. This specific Python script is heavily multithreaded. Does a complex process/thread environment affect which method is more apropos?
A multithreaded application that uses database connections should probably use some sort of connection pool. In this scheme, you have a predefined number of connections, all managed in some sort of queue. When a thread needs to run a transaction, it gets a connection from the pool. If the pool has a free connection available, the thread takes it; if not, a new connection is created for the thread, up to the pool's limit, at which point the thread waits until another thread frees a connection. When a thread is done with a connection, it returns it to the pool, which hands it off to another waiting thread, or closes it if no threads are waiting.
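Here is a minimal sketch of that scheme using Python's standard queue and threading modules. The connection parameters and query are placeholders, and a real pool would also handle timeouts, dead-connection checks, and closing idle connections:

    import queue
    import threading
    import MySQLdb

    class ConnectionPool:
        """Connections are created lazily up to max_size; beyond that,
        threads block until a connection is returned to the pool."""

        def __init__(self, max_size, **conn_kwargs):
            self._idle = queue.Queue()
            self._lock = threading.Lock()
            self._created = 0
            self._max_size = max_size
            self._conn_kwargs = conn_kwargs  # placeholder MySQLdb.connect() args

        def get(self):
            try:
                return self._idle.get_nowait()   # reuse a free connection
            except queue.Empty:
                pass
            with self._lock:
                if self._created < self._max_size:
                    self._created += 1
                    return MySQLdb.connect(**self._conn_kwargs)
            return self._idle.get()              # pool is full: wait for a free one

        def put(self, conn):
            self._idle.put(conn)

    pool = ConnectionPool(5, host="localhost", user="user",
                          passwd="secret", db="mydb")  # placeholder credentials

    def worker():
        conn = pool.get()
        try:
            cur = conn.cursor()
            cur.execute("SELECT 1")  # hypothetical query
            cur.fetchall()
        finally:
            pool.put(conn)           # always return the connection to the pool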
There are probably dozens of implementations of this logic, but I can highly recommend SQLAlchemy, and for more than just its connection management (although you don't have to use it for much more than that).
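For example, SQLAlchemy's create_engine() maintains a connection pool behind the scenes; the URL, credentials, and pool sizes below are placeholders:

    from sqlalchemy import create_engine, text

    engine = create_engine(
        "mysql+mysqldb://user:secret@localhost/mydb",  # placeholder URL
        pool_size=5,      # connections kept open in the pool
        max_overflow=10,  # extra connections allowed under heavy load
    )

    with engine.connect() as conn:  # checks a connection out of the pool
        rows = conn.execute(text("SELECT 1")).fetchall()
    # leaving the block returns the connection to the pool rather than closing it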