I have a complex Django web application that has many person-years of work put into it. It might need optimisation at some point. There are several common operations/flows that I could script with (say) Django's test client. Is there some program that, given a Python script like that, will run it and report on various Django-specific performance metrics, such as the number of SQL queries run?
Essentially something like a unittest test suite, but rather than reporting "0 tests failed", it'd report "X db queries were made"
I could write this myself; it's not exactly a complex problem, but I wonder whether anyone has done it before.
I know about Django Debug Toolbar, which can already do a lot of this, but is there something more command-line oriented that works across many pages rather than a single page refresh? Likewise, getting the actual queries is relatively easy. But has anyone wrapped the whole thing up in a script/library?
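To be concrete, here's a rough sketch of the sort of script I mean, using the test client and CaptureQueriesContext (the settings module, URLs, and credentials below are placeholders):

import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # placeholder

import django

django.setup()

from django.db import connection
from django.test import Client
from django.test.utils import CaptureQueriesContext, setup_test_environment

setup_test_environment()  # lets the test client's "testserver" host pass ALLOWED_HOSTS

client = Client()

# Drive a couple of representative flows and count the SQL they trigger.
with CaptureQueriesContext(connection) as ctx:
    client.get("/")
    client.post("/login/", {"username": "demo", "password": "demo"})

print(len(ctx.captured_queries), "db queries were made")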
django-debug-toolbar is a very handy tool that provides insights into what your code is doing and how much time it spends doing it.
From the docs: when using SQLite, the tests will use an in-memory database by default (i.e., the database will be created in memory, bypassing the filesystem entirely!). The TEST dictionary in DATABASES offers a number of settings to configure your test database.
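As a rough illustration of where that TEST dictionary goes (the names below are placeholders), a settings.py entry might look like:

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "db.sqlite3",  # placeholder
        "TEST": {
            # For SQLite, leaving NAME unset/None keeps the test database
            # in memory; give it a filename to put it on disk instead.
            "NAME": None,
        },
    }
}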
If you are writing a reusable application, you may want to use the Django test runner to run your own test suite and thus benefit from the Django testing infrastructure. A common practice is a tests directory next to the application code, with a structure along the lines of runtests.py, polls/__init__.py, polls/models.py, and so on.
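A minimal runtests.py along those lines, adapted from the pattern in the Django docs, might look roughly like this (the INSTALLED_APPS list and the "polls" label are assumptions):

import sys

import django
from django.conf import settings
from django.test.utils import get_runner

if __name__ == "__main__":
    # Configure just enough settings to run the app's tests standalone.
    settings.configure(
        DEBUG=True,
        DATABASES={"default": {"ENGINE": "django.db.backends.sqlite3"}},
        INSTALLED_APPS=[
            "django.contrib.contenttypes",
            "django.contrib.auth",
            "polls",
        ],
    )
    django.setup()
    TestRunner = get_runner(settings)
    failures = TestRunner().run_tests(["polls"])
    sys.exit(bool(failures))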
Unit Tests are isolated tests that test one specific function. Integration Tests, meanwhile, are larger tests that focus on user behavior and testing entire applications. Put another way, integration testing combines different pieces of code functionality to make sure they behave correctly.
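To make the distinction concrete in Django terms, here is a hedged sketch (the add_vote_counts helper and the /polls/ URL are made up for illustration):

from django.test import TestCase

from polls.utils import add_vote_counts  # hypothetical helper


class VoteMathUnitTest(TestCase):
    """Unit test: exercises one function in isolation."""

    def test_add_vote_counts(self):
        self.assertEqual(add_vote_counts(2, 3), 5)


class PollFlowIntegrationTest(TestCase):
    """Integration test: exercises a whole request/response cycle."""

    def test_index_page_renders(self):
        response = self.client.get("/polls/")  # assumes a /polls/ URL is wired up
        self.assertEqual(response.status_code, 200)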
You can make a TestCase base class, something like PerformanceTestCase, which uses setUp() to start the timer and tearDown() to measure the time taken and the number of SQL queries, and then output the results wherever you like.
import datetime

from django.db import connection
from django.test import TestCase

class PerformanceTestCase(TestCase):
    def setUp(self):
        # Start the timer before each test.
        self.begin_time = datetime.datetime.now()

    def tearDown(self):
        # Report elapsed time and the number of SQL queries logged.
        delta = datetime.datetime.now() - self.begin_time
        print('Time taken', delta.total_seconds())
        print('SQL queries', len(connection.queries))
Maybe you'll need to reset the query log with django.db.reset_queries(), but I think it's reset between tests anyway.
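One caveat worth adding: connection.queries is only populated when DEBUG is True, so a subclass might switch that on per test. A hedged usage sketch (the URL is a placeholder):

from django.test.utils import override_settings


@override_settings(DEBUG=True)  # ensure the query log is actually populated
class HomePagePerformanceTest(PerformanceTestCase):
    def test_home_page(self):
        self.client.get("/")  # placeholder flow to measure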
Use something like Graphite or OpenTSDB combined with something like StatsD for non-blocking stats that let you measure anything and plot it in real time. The best part is that it lets your engineers easily plot whatever they need. Hooked up with collectd, you can graph your app against memory/CPU usage and DB queries.
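As a rough sketch of the StatsD side (assuming the statsd Python package and a local StatsD daemon; the middleware class, prefix, and metric name are made up):

from django.db import connection
from statsd import StatsClient

statsd = StatsClient("localhost", 8125, prefix="myapp")


class QueryCountMiddleware:
    """Push per-request SQL query counts to StatsD (and on to Graphite).

    connection.queries is only populated when DEBUG is True, so this is
    for measurement environments rather than production.
    """

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        before = len(connection.queries)
        response = self.get_response(request)
        statsd.incr("db.queries", len(connection.queries) - before)
        return response

Add it to MIDDLEWARE in settings.py and the counts should show up under myapp.db.queries, ready to graph.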
Here is a sample image from a blog article on how Etsy uses Graphite: