@celery.task
def my_task(my_object):
    do_something_to_my_object(my_object)

# in the code somewhere
tasks = celery.group([my_task.s(obj) for obj in MyModel.objects.all()])
group_task = tasks.apply_async()
Question: Does Celery have something to detect the progress of a group task? Can I get a count of how many tasks there were and how many have been processed?
As an alternative, Flower is a monitoring application for Celery that provides a graphical user interface (GUI) for looking at counts and trends, and for diving deeper into the tasks/messages responsible for the workers' actions.
Tinkering around in the shell (with IPython's tab auto-completion), I found that group_task (which is a celery.result.ResultSet object) has a method called completed_count, which gave exactly what I needed.
The documentation is at http://docs.celeryproject.org/en/latest/reference/celery.result.html#celery.result.ResultSet.completed_count
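As a rough sketch, completed_count can drive a simple progress loop. The FakeResultSet class below is a hypothetical stand-in (not part of Celery) so the example runs without a broker; in real code you would pass the group_task returned by tasks.apply_async(), which exposes the same completed_count() and ready() methods.

```python
import time

class FakeResultSet:
    """Hypothetical stand-in for celery.result.ResultSet, for illustration only."""

    def __init__(self, total):
        self.total = total
        self._done = 0

    def completed_count(self):
        # Simulate a few tasks finishing between each poll.
        self._done = min(self._done + 3, self.total)
        return self._done

    def ready(self):
        return self._done >= self.total


def poll_progress(result_set, total, interval=0.0):
    """Poll completed_count() until all tasks finish; return the percentages seen."""
    history = []
    while True:
        done = result_set.completed_count()
        history.append(done / total * 100)
        if result_set.ready():
            return history
        time.sleep(interval)


progress = poll_progress(FakeResultSet(total=9), total=9)
print(progress)  # climbs toward 100.0
```

With a real group, you would also know the total up front (e.g. len(group_task.results)), so the same loop can report "N of M tasks done".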