
How should I establish and manage database connections in a multi-module Python app?

Tags: python, mysql

We have a Python application with over twenty modules, most of which are shared by several web and console applications.

I've never had a clear understanding of the best practice for establishing and managing database connections in a multi-module Python app. Consider this example:

I have a module defining an object class for Users. It has many functions for creating/deleting/updating users in the database. The users.py module is imported into 1) a console-based utility, 2) a web.py-based web application, and 3) a constantly running daemon process.

Each of these three applications has a different life cycle. The daemon can open a connection and keep it open. The console utility connects, does its work, then exits. The HTTP requests are atomic, of course, but the web server itself is a daemon.

I am currently opening, using, and then closing a connection inside each function in the Users class. This seems the most inefficient, but it works in all of these cases. An alternative, used as a test, is to declare and open a global connection for the entire module. Another option would be to create the connection at the top application layer and pass references in when instantiating classes, but that seems like the worst idea to me.
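For illustration, the first two options look roughly like this (a minimal sketch; the table name, column names, DB_CONFIG, and the choice of the mysql-connector-python driver are all my own assumptions, not part of the actual app):

    # users.py -- sketch of two of the approaches described above
    import mysql.connector

    DB_CONFIG = {"host": "localhost", "user": "app", "password": "secret", "database": "appdb"}

    # Option A: open, use, and close a connection inside every function (current approach)
    class Users:
        def delete(self, user_id):
            conn = mysql.connector.connect(**DB_CONFIG)
            try:
                cur = conn.cursor()
                cur.execute("DELETE FROM users WHERE id = %s", (user_id,))
                conn.commit()
            finally:
                conn.close()

    # Option B: one module-level connection shared by everything in the module
    _conn = mysql.connector.connect(**DB_CONFIG)

    def delete_user(user_id):
        cur = _conn.cursor()
        cur.execute("DELETE FROM users WHERE id = %s", (user_id,))
        _conn.commit()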

I know every application architecture is different. I'm just wondering if there's a best practice, and what it would be?

asked Feb 14 '13 by hikaru


1 Answer

The best method is to open a connection when you need to do some operations (such as getting and/or updating data), manipulate the data, write it back to the database in one query (very important for performance), and then close the connection. Opening a connection is a fairly lightweight process.
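In code, that cycle might look something like the sketch below (the table, columns, and connection settings are hypothetical, and it assumes the mysql-connector-python driver):

    import mysql.connector

    def apply_discount(user_id, percent):
        # Open a connection only for this unit of work.
        conn = mysql.connector.connect(host="localhost", user="app",
                                       password="secret", database="appdb")
        try:
            cur = conn.cursor()
            # Get the data...
            cur.execute("SELECT balance FROM accounts WHERE user_id = %s", (user_id,))
            (balance,) = cur.fetchone()
            # ...manipulate it in Python...
            new_balance = balance * (100 - percent) / 100
            # ...write it back in a single query...
            cur.execute("UPDATE accounts SET balance = %s WHERE user_id = %s",
                        (new_balance, user_id))
            conn.commit()
        finally:
            # ...and then close the connection.
            conn.close()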

Some pitfalls for performance include:

  • opening a connection when you're not sure you'll actually interact with the database
  • running queries that select more data than you need (e.g., fetching data about all users and filtering it in Python, instead of asking MySQL to filter out the rows you don't need)
  • writing values that haven't changed (e.g., updating every field of a user profile when only the email has changed)
  • sending each field update to the server individually (e.g., open the db, update the user email, close the db, open the db, update the user password, close the db, open th... you get the idea)

The bottom line is that it doesn't matter so much how many times you open the database; it's how many queries you run. If you can get your code to combine related queries, you've won the battle.
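For example, rather than one connection and one UPDATE per field, write all the changed fields in a single statement (again a sketch; the table, columns, and DB_CONFIG are my own hypothetical names):

    import mysql.connector

    DB_CONFIG = {"host": "localhost", "user": "app", "password": "secret", "database": "appdb"}

    def save_profile(user_id, email, password_hash):
        # One connection and one query that writes everything that changed,
        # instead of open/update/close once per field.
        conn = mysql.connector.connect(**DB_CONFIG)
        try:
            cur = conn.cursor()
            cur.execute(
                "UPDATE users SET email = %s, password_hash = %s WHERE id = %s",
                (email, password_hash, user_id),
            )
            conn.commit()
        finally:
            conn.close()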

answered Oct 26 '22 by Brigand