I'm trying to write some unit tests for some celery tasks in my Django app. These tasks take a model id as the argument, do some stuff, and update the model. When running a devserver and celery worker, everything works great, but when running my tests, it has become clear that the celery task is not using the django test db that gets created and destroyed as part of the test run. Question is, how can I get celery to use the same temporary db as the rest of my tests?
As you can see, I'm using the settings overrides that are suggested in every answer for similar issues.
UPDATE: Discovered that instead of passing the object id to the task and having the task fetch it from the db, if I simply pass the object itself to the task, the tests pass, with apparently no adverse effects on the functioning of the task. So at least for now, that will be my fix.
In my test:
class JobTest(TestCase):

    @override_settings(CELERY_ALWAYS_EAGER=True,
                       CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                       BROKER_BACKEND='memory')
    def test_Job_Complete(self):
        job = models.Job()
        job.save()
        tasks.do_a_thing(job.id)
        self.assertTrue(job.complete)
In my task:
@celery.task
def do_a_thing(job_id):
    job = models.Job.objects.get(pk=job_id)
    bunch_of_things(job)
    job.complete = True
    job.save()
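For reference, the workaround from the UPDATE looks roughly like this. It's a minimal sketch, and note that it only behaves identically to the id-based version when the task runs eagerly in-process (with a real broker, the instance would have to survive serialization):

# Sketch of the workaround: pass the instance, not the id
@celery.task
def do_a_thing(job):
    # operates on the instance passed in rather than re-fetching it
    bunch_of_things(job)
    job.complete = True
    job.save()

# the test then passes the object itself; since the eager task mutates
# this same instance, the assertion sees the updated flag
tasks.do_a_thing(job)
self.assertTrue(job.complete)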
One way to guarantee that the Celery worker uses the same test database as your tests is to spawn the worker inside the test itself, by calling start_worker in the setUpClass method of the TestCase:
from celery.contrib.testing.worker import start_worker
from myproject.celery import app

@classmethod
def setUpClass(cls):
    super().setUpClass()
    # start_worker returns a context manager; enter it here and keep a
    # reference so tearDownClass can shut the worker down again
    cls.celery_worker = start_worker(app)
    cls.celery_worker.__enter__()
You also have to use a SimpleTestCase from Django (or an APISimpleTestCase from REST framework) rather than a plain TestCase, so that the Celery thread and the test thread can see the changes each makes to the test database. The changes are still destroyed at the end of the test run, but they are not destroyed between tests unless you destroy them manually in the tearDown method.
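Put together, a sketch of such a test class might look like this. The module paths (myproject.celery, myapp), the perform_ping_check=False flag, and the assumption of a configured result backend for .get() are mine, not part of the original answer:

from celery.contrib.testing.worker import start_worker
from django.test import SimpleTestCase

from myproject.celery import app
from myapp import models, tasks  # assumed module names

class JobTaskTest(SimpleTestCase):
    databases = '__all__'  # SimpleTestCase blocks DB access by default (Django 2.2+)

    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        cls.celery_worker = start_worker(app, perform_ping_check=False)
        cls.celery_worker.__enter__()

    @classmethod
    def tearDownClass(cls):
        cls.celery_worker.__exit__(None, None, None)
        super().tearDownClass()

    def test_job_complete(self):
        job = models.Job.objects.create()
        tasks.do_a_thing.delay(job.id).get(timeout=10)  # needs a result backend
        job.refresh_from_db()
        self.assertTrue(job.complete)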
I battled with a similar problem. The following solution is not clean, but it works.

Create a settings file named integration_testing.py. It should look like this:

from .settings import *

DATABASES = {
    'default': {
        'ENGINE': '<your engine>',
        'NAME': 'test_<your database name>',
        'USER': '<your db user>',
        'PASSWORD': '<your db password>',
        'HOST': '<your hostname>',
        'PORT': '<your port number>',
    }
}
Create a shell script which will set your environment and start up the celery worker:
#!/usr/bin/env bash
export DJANGO_SETTINGS_MODULE="YOURPROJECTNAME.settings.integration_testing"
celery purge -A YOURPROJECTNAME -f && celery worker -A YOURPROJECTNAME -l debug
The above works if you configured celery in this manner:
app = Celery('YOURPROJECTNAME')
app.config_from_object('django.conf:settings', namespace='CELERY')
Run the script in the background.
Make all tests that involve Celery inherit from TransactionTestCase (or APITransactionTestCase in django-rest-framework).
Run your unit tests. Any Celery tasks will now use your test db. And hope for the best.
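For illustration, a test written against this setup might look like the following sketch; the myapp module, the Job model, and the requirement of a result backend for result.get() are assumptions:

from django.test import TransactionTestCase

from myapp import models, tasks  # assumed module names

class JobTaskIntegrationTest(TransactionTestCase):
    def test_job_complete(self):
        # TransactionTestCase commits, so the externally-running worker
        # (started by the script above) can actually see this row
        job = models.Job.objects.create()
        result = tasks.do_a_thing.delay(job.id)
        result.get(timeout=10)  # blocks until the worker finishes; needs a result backend
        job.refresh_from_db()
        self.assertTrue(job.complete)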
There's no obvious problem with your code. You don't need to run a celery worker: with these settings, Celery runs the task synchronously and never actually sends anything to your message queue.
You can't easily run tests against live Celery workers anyway, because each test is wrapped in a transaction, so even if the test and the worker were connecting to the same database (they aren't), the test's transaction is always rolled back and its data is never visible to the worker.
If you really need to do this, look at this stackoverflow answer.
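For completeness, here is a sketch of the eager path this answer describes; the module names are assumptions, and the refresh_from_db() call is added here because the task loads and saves its own copy of the row:

from django.test import TestCase, override_settings

from myapp import models, tasks  # assumed module names

@override_settings(CELERY_ALWAYS_EAGER=True,
                   CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                   BROKER_BACKEND='memory')
class EagerJobTest(TestCase):
    def test_job_complete(self):
        job = models.Job.objects.create()
        tasks.do_a_thing.delay(job.id)  # runs inline under ALWAYS_EAGER
        job.refresh_from_db()  # re-read the row the task updated
        self.assertTrue(job.complete)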