Django Celery receiving and accepting tasks, but not executing them

I'm running a Django 1.11 server (built from the Cookiecutter-Django template) on a Digital Ocean droplet running Ubuntu 16.04 with Gunicorn and Nginx, and I'm trying to set up Celery tasks using Redis. The worker seems to start and receive periodic tasks fine when I run:

celery -A config worker -B -l debug

The tasks are received and accepted, but they never execute. To test, I'm sending this task:

from celery import shared_task

@shared_task(name="sum_two_numbers")
def add(x, y, **kwargs):
    return x + y

with:

add.delay(1, 3)
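
One way to confirm the task never finishes is to poll the AsyncResult it returns (a minimal check, assuming the Redis result backend shown in the worker banner below):

result = add.delay(1, 3)
print(result.state)    # stays 'PENDING' instead of reaching 'SUCCESS'
result.get(timeout=5)  # raises celery.exceptions.TimeoutError if the task never runs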

And this is the complete console output from the Celery worker:

 -------------- celery@myproject v4.1.0 (latentcall)
---- **** -----
--- * ***  * -- Linux-4.4.0-112-generic-x86_64-with-Ubuntu-16.04-xenial 2018-02-19 23:18:12
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         myproject:0x7f2cd60dc9e8
- ** ---------- .> transport:   redis://127.0.0.1:6379//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . . .
  . sum_two_numbers

[2018-02-19 23:18:12,858: INFO/MainProcess] Connected to redis://127.0.0.1:6379//
[2018-02-19 23:18:12,876: INFO/MainProcess] mingle: searching for neighbors
[2018-02-19 23:18:13,910: INFO/MainProcess] mingle: all alone
[2018-02-19 23:18:13,924: WARNING/MainProcess] /home/user/.virtualenvs/myproject/lib/python3.5/site-packages/celery/fixups/django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2018-02-19 23:19:38,714: INFO/MainProcess] Received task: sum_two_numbers[ab5b5547-1337-4dec-8848-c15e1a194b51]
[2018-02-19 23:19:38,715: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x7f2cd5fce510> (args:('sum_two_numbers', 'ab5b5547-1337-4dec-8848-c15e1a194b51', {'root_id': 'ab5b5547-1337-4dec-8848-c15e1a194b51', 'task': 'sum_two_numbers', 'group': None, 'correlation_id': 'ab5b5547-1337-4dec-8848-c15e1a194b51', 'id': 'ab5b5547-1337-4dec-8848-c15e1a194b51', 'timelimit': [None, None], 'expires': None, 'retries': 0, 'argsrepr': '(1, 3)', 'eta': None, 'origin': 'gen23535@myproject', 'reply_to': 'e67c54ef-3c66-3720-9e1f-62ef3d76882d', 'kwargsrepr': '{}', 'lang': 'py', 'parent_id': None, 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': ''}}, b'[[1, 3], {}, {"errbacks": null, "chain": null, "chord": null, "callbacks": null}]', 'application/json', 'utf-8') kwargs:{})
[2018-02-19 23:19:38,722: DEBUG/MainProcess] Task accepted: sum_two_numbers[ab5b5547-1337-4dec-8848-c15e1a194b51] pid:23512

When I run locally, it works just fine. What am I doing wrong here?

Asked by halsdunes on Feb 20 '18

People also ask

How does Celery execute tasks?

Once you integrate Celery into your app, you can send time-intensive tasks to Celery's task queue. That way, your web app can continue to respond quickly to users while Celery completes expensive operations asynchronously in the background.
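
As a minimal sketch of that pattern (the module and task names here are hypothetical), a Django view enqueues the task and returns immediately while the worker does the slow work in the background:

from django.http import JsonResponse

from myapp.tasks import generate_report  # hypothetical time-intensive task

def request_report(request):
    generate_report.delay(request.user.id)  # enqueue; does not block the response
    return JsonResponse({"status": "queued"})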


1 Answer

You can try running the worker with this command:

celery -A <App_name> worker -l info --without-gossip --without-mingle --without-heartbeat -Ofair --pool=solo

This solved my issue by running the worker with the solo pool. Maybe this is your case as well.
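
The solo pool runs tasks inline in the worker's main process instead of forking child processes, which can side-step prefork workers that accept tasks but never run them. If that turns out to be the fix, you can make it permanent instead of passing the flag every time (a sketch assuming the standard Cookiecutter-Django setup, where Celery reads Django settings under the CELERY namespace):

# settings.py (assumes app.config_from_object('django.conf.settings', namespace='CELERY'))
CELERY_WORKER_POOL = "solo"  # equivalent of passing --pool=solo on the command line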

Answered by Mobasshir Bhuiya on Sep 25 '22