From my reading today, none of the examples I found run Celery in a completely separate container from Django itself. It seems as though Celery has to be in the same container, since it walks the app's source files looking for tasks.py as well as the initial celery.py. Is that correct, or did I misread?
For example, I am familiar with using docker-compose to spin up Django, Nginx, Postgres, and a storage container. I assumed I'd be adding a Celery and a RabbitMQ container, but I see no way to configure Django to use a remote Celery server.
I'm still early in my understanding of Celery, so I hope this isn't something I overlooked elsewhere.
Thanks,
-p
If we try to run Celery on Windows, we will run into a problem: Windows is not officially supported by Celery.
Here are some key points:
- DJANGO_SETTINGS_MODULE must be set in the environment before starting a Celery process. Its presence in the environment triggers internal magic in Celery to run the Django setup at the right time.
- The Celery "application" must be created and configured during startup of both Django and Celery.
By default, that's what happens if you use Heroku: it runs a web dyno for Django to respond to requests and a separate worker dyno for Celery, with each dyno on its own instance.
Both dynos run the same code, since your Celery worker needs access to the models and it's easier to manage and deploy one code base. But there is nothing stopping you from using a different code base for each instance, because all communication between Django and Celery goes through a message broker such as RabbitMQ (over AMQP) or Redis.
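This is why a remote Celery worker needs no special Django-side wiring: both sides just point at the same broker. A minimal settings sketch, assuming hypothetical docker-compose service names `rabbitmq` and `redis` (so the hostnames resolve inside the compose network, not on localhost):

```python
# settings.py fragment -- hypothetical hostnames from docker-compose services.
# The Django container and the Celery worker container both use these values;
# neither container needs to reach the other directly.
CELERY_BROKER_URL = "amqp://guest:guest@rabbitmq:5672//"

# Optional: store task results in a Redis container.
CELERY_RESULT_BACKEND = "redis://redis:6379/0"
```

With this in place, the worker can run in its own container (or on another host entirely), as long as it can reach the broker and has the task code available.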