I found some legacy code that, every time it sends a query, reads the current configuration from the app, creates an engine from the URI, and then makes a session. For example, if one API call sends 9 queries, it creates the engine nine times, makes nine sessions, and then closes them. I wonder whether this hurts performance, because creating the engine repeatedly looks inefficient given that this application uses only two MySQL databases (master and slave).
You can easily use the built-in timeit module to test things like this. For example:
C:\test> python -m timeit -u msec -s "from sqlalchemy import create_engine" "engine=create_engine('sqlite:///');engine.execute('SELECT 1')"
returns:
200 loops, best of 5: 1.21 msec per loop
and:
C:\test> python -m timeit -u msec -s "from sqlalchemy import create_engine;engine=create_engine('sqlite:///')" "engine.execute('SELECT 1')"
returns:
5000 loops, best of 5: 0.0371 msec per loop
Note that the only difference between the two is that in the second command, the engine=create_engine('sqlite:///')
statement is moved to the setup code, so the engine is created once instead of on every loop iteration.
So from this test, executing a SQL statement against a globally defined engine is ~33x faster (1.21 msec vs. 0.0371 msec) than building the engine for every statement issued.
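As a minimal sketch of the recommended pattern (the names get_scalar, Session, and the sqlite:/// URI are illustrative, not taken from the legacy code): build the engine once at module level and hand out short-lived sessions per query, so the engine's connection pool is reused instead of rebuilt.

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# Created once per process; the engine owns the connection pool.
# In the app from the question this would be the MySQL master/slave URI
# pulled from the configuration, once, at startup.
engine = create_engine("sqlite:///")
Session = sessionmaker(bind=engine)

def get_scalar(sql):
    # Each query gets its own short-lived session (cheap), but the
    # engine and its pool are shared, avoiding the per-query
    # create_engine() cost measured above.
    with Session() as session:
        return session.execute(text(sql)).scalar()

print(get_scalar("SELECT 1"))
```

For the two-database case, the same idea applies: keep one module-level engine per database (master and slave) and pick the right one per query.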