So I've configured Celery + RabbitMQ and everything works. Then I added the --autoreload
option to the celery -A proj worker --loglevel=debug
command, and logging stops at this line:
[2014-09-11 19:22:00,447: DEBUG/MainProcess] | Worker: Hub.register Autoreloader...
Without it:
[2014-09-11 19:37:34,316: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2014-09-11 19:37:34,317: DEBUG/MainProcess] basic.qos: prefetch_count->16
[2014-09-11 19:37:36,275: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2014-09-11 19:37:36,275: INFO/MainProcess] Events of group {task} enabled by remote.
Everything else works just fine. RabbitMQ receives a message after
celery call tasks.update
and there are connections from Celery, but the worker never starts the task.
It looks like a connection problem, but I can't figure out what it is.
Any help would be appreciated.
There is a bug in Celery at celery/worker/autoreload.py, line 67.
A fix has been committed to master: https://github.com/pashinin/celery/commit/92b52db6eeeb75494700ffe807ecd4c1fe6b0643
You can patch the library yourself by changing line 67 of autoreload.py from
for chunk in iter(lambda: f.read(2 ** 20), ''):
to
for chunk in iter(lambda: f.read(2 ** 20), b''):
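The sentinel type matters because the autoreloader opens the module file in binary mode, so f.read() returns bytes. Under Python 3, bytes never compare equal to the str sentinel '', so the loop would never see EOF; the sentinel must be b''. A minimal sketch of the fixed chunked-read pattern, using io.BytesIO to stand in for the opened file:

```python
import io

# Stand-in for a task module opened in binary mode.
data = b"example module contents\n" * 100
f = io.BytesIO(data)

# Fixed loop: the sentinel b'' matches what f.read() returns at EOF.
# With the str sentinel '' (the buggy version), b'' != '' in Python 3,
# so iter() would never terminate.
chunks = []
for chunk in iter(lambda: f.read(2 ** 20), b''):
    chunks.append(chunk)

assert b"".join(chunks) == data
```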
After this change, you may still face one more issue: if a task has already been registered, it will NOT be recreated, and the old code keeps running after the module reload. A reloaded task only becomes active after you execute it one more time.
The Celery developers don't seem willing to fix this issue anytime soon, so until then you have two options:
Execute the updated task once more so the new version is picked up, or
restart the Celery worker after a task has been updated.