I am running a celery worker like this:
celery worker --app=portalmq --logfile=/tmp/portalmq.log --loglevel=INFO -E --pidfile=/tmp/portalmq.pid
Now I want to run this worker in the background. I have tried several things, including:
nohup celery worker --app=portalmq --logfile=/tmp/portal_mq.log --loglevel=INFO -E --pidfile=/tmp/portal_mq.pid >> /tmp/portal_mq.log 2>&1 </dev/null &
But it is not working. I have checked the Celery documentation, and found this. In particular, this comment is relevant:
In production you will want to run the worker in the background as a daemon.
To do this you need to use the tools provided by your platform, or something
like supervisord (see Running the worker as a daemon for more information).
This is too much overhead just to run a process in the background. I would need to install supervisord on my servers and get familiar with it. No go at the moment. Is there a simple way of running a celery worker in the background?
Supervisor is really simple and requires very little work to set up, and the same applies to Celery in combination with Supervisor.
It should not take more than 10 minutes to set it up :)
install supervisor with apt-get
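On Debian/Ubuntu that would be something like:
sudo apt-get install supervisor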
create /etc/supervisor/conf.d/celery.conf config file
paste something like this in the celery.conf file:
[program:celery]
directory = /my_project/
command = /usr/bin/python manage.py celery worker
plus, if you need them, some optional and useful settings (with dummy values):
user = celery_user
group = celery_group
stdout_logfile = /var/log/celeryd.log
stderr_logfile = /var/log/celeryd.err
autostart = true
environment=PATH="/some/path/",FOO="bar"
restart supervisor (or do supervisorctl reread; supervisorctl add celery)
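For example, on a Debian/Ubuntu setup that would look roughly like this (adjust to your platform's service manager):
sudo service supervisor restart
or, to pick up only the new program without restarting everything:
sudo supervisorctl reread
sudo supervisorctl add celery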
after that you get the nice ctl commands to manage the celery process:
supervisorctl start/restart/stop celery
supervisorctl tail [-f] celery [stderr]
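For example, to start the worker and then follow its error log:
supervisorctl start celery
supervisorctl tail -f celery stderr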
Alternatively, Celery can put the worker into the background itself with the --detach option:
celery worker -A app.celery --loglevel=info --detach
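Note that a detached worker has no terminal to write to, so it is worth passing explicit log and pid files as well; a minimal sketch (the /tmp paths are just placeholders):
celery worker -A app.celery --loglevel=info --detach --logfile=/tmp/celery.log --pidfile=/tmp/celery.pid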