I've written a series of tests for my Django app, and would like to run them on a copy of my production database.
As far as I can tell, the best way to do this is using fixture loading like so:
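(App and fixture names below are just placeholders.)

    python manage.py dumpdata myapp > myapp/fixtures/myapp.json

…and then pointing each test case at the resulting fixture:

    from django.test import TestCase

    class MyAppTests(TestCase):
        fixtures = ['myapp.json']

        def test_something(self):
            ...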
However, this approach is cumbersome. I have multiple apps, and running manage.py dumpdata for each of them and manually moving the fixture files around every time I want to test my app is a pain.
Is there an easier way to automatically generate a copy of my entire production database and test my Django apps against it?
Open /catalog/tests/test_models.py. The generated file is just a stub that imports django.test.TestCase, as shown:

    from django.test import TestCase

    # Create your tests here.

Often you will add a test class for each model/view/form you want to test, with individual methods for testing specific functionality.
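For instance, a model test class might look like this (the Author model and its fields are assumed here for illustration):

    from django.test import TestCase

    from catalog.models import Author  # assumed example model

    class AuthorModelTest(TestCase):
        @classmethod
        def setUpTestData(cls):
            # Runs once for the whole class: create a record to test against.
            Author.objects.create(first_name='Big', last_name='Bob')

        def test_first_name_label(self):
            author = Author.objects.get(id=1)
            field_label = author._meta.get_field('first_name').verbose_name
            self.assertEqual(field_label, 'first name')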
The preferred way to write tests in Django is with the unittest module built into the Python standard library. This is covered in detail in the Writing and running tests document. You can also use any other Python test framework; Django provides an API and tools for that kind of integration.
Running tests in parallel: if your machine has multiple CPU cores, you can leverage them with the --parallel flag. Django will create additional processes to run your tests, and additional databases to run them against. You will see something like this:

    > python3 manage.py test --parallel --keepdb

    Using existing test database for alias 'default'...
With django-test-migrations the workflow is:

1. Set some migration as a starting point.
2. Create some model data that you want to test.
3. Run the new migration that you are testing.
4. Assert the results!
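A minimal sketch using its MigratorTestCase helper (the app label, migration names, and SomeItem model are placeholders):

    from django_test_migrations.contrib.unittest_case import MigratorTestCase

    class TestDirectMigration(MigratorTestCase):
        # Placeholder app label and migration names.
        migrate_from = ('main_app', '0001_initial')
        migrate_to = ('main_app', '0002_rename_field')

        def prepare(self):
            # Create data against the old schema, before the migration runs.
            SomeItem = self.old_state.apps.get_model('main_app', 'SomeItem')
            SomeItem.objects.create(string_field='a')

        def test_migration(self):
            # self.new_state reflects the schema after migrate_to has run.
            SomeItem = self.new_state.apps.get_model('main_app', 'SomeItem')
            self.assertEqual(SomeItem.objects.count(), 1)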
Generally, testing against the live DB or a copy of the live DB is discouraged. Why? Because tests need to be predictable, and when you copy the live DB, the input becomes unpredictable. The second problem is that you obviously cannot test against the live site itself, so you need to clone the data, and that's slow for anything more than a few MB in size.
Even if the DB is small, dumpdata followed by loaddata isn't the way. That's because dumpdata exports, by default, to JSON, a format that is expensive to generate and makes the data file very bulky. Importing with loaddata is even slower.
The only realistic way to make a clone is using the database engine's built-in export/import mechanism. In the case of SQLite, that's just copying the DB file. For MySQL it's SELECT INTO OUTFILE followed by LOAD DATA INFILE. And for PostgreSQL it's COPY TO followed by COPY FROM, and so on.
All of these export/import commands can be executed using the low-level connection object available in Django, and can thus be used to load fixtures.
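For example, a minimal sketch for PostgreSQL (assuming the psycopg2 driver, a placeholder table myapp_item, and a hypothetical CSV dump at /tmp/myapp_item.csv; copy_expert is a psycopg2 cursor method that Django's cursor wrapper passes through):

    from django.db import connection
    from django.test import TransactionTestCase

    class ProductionCloneTests(TransactionTestCase):
        def setUp(self):
            # Bulk-load a table with PostgreSQL's COPY FROM, which is far
            # faster than loaddata. Table and file names are placeholders.
            with connection.cursor() as cursor, \
                    open('/tmp/myapp_item.csv') as dump:
                cursor.copy_expert(
                    'COPY myapp_item FROM STDIN WITH (FORMAT csv, HEADER)',
                    dump,
                )

        def test_item_count(self):
            with connection.cursor() as cursor:
                cursor.execute('SELECT COUNT(*) FROM myapp_item')
                self.assertGreater(cursor.fetchone()[0], 0)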