I followed the First Steps with Celery (Django) tutorial and am trying to run a heavy process in the background. I have the RabbitMQ server installed. However, when I try
celery -A my_app worker -l info
it throws the following error
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "c:\anaconda3\lib\site-packages\celery\concurrency\prefork.py", line
18, in <module>
from celery.concurrency.base import BasePool
File "c:\anaconda3\lib\site-packages\celery\concurrency\base.py", line 15,
in <module>
from celery.utils import timer2
File "c:\anaconda3\lib\site-packages\celery\utils\timer2.py", line 16, in
<module>
from kombu.asynchronous.timer import Entry
ModuleNotFoundError: No module named 'kombu.asynchronous.timer'
I've searched a lot, but can't seem to get it working. Any help will be highly appreciated. Thank you!
I'm running Windows 10, Python 3.9.x, using AWS SQS as the broker, and just finished updating some files:
settings.py
###
### For celery tasks!
###
from kombu.utils.url import safequote
import urllib.parse
AWS_ACCESS_KEY_ID = 'my aws_access_key_id for a user'
AWS_SECRET_ACCESS_KEY = 'my aws_secret_access_key for a user'
BROKER_URL = 'sqs://%s:%s@' % (urllib.parse.quote(AWS_ACCESS_KEY_ID, safe=''), urllib.parse.quote(AWS_SECRET_ACCESS_KEY, safe=''))
BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT_OPTIONS = {
    'canves-celery-queue': {
        'access_key_id': safequote(AWS_ACCESS_KEY_ID),
        'secret_access_key': safequote(AWS_SECRET_ACCESS_KEY),
        'region': 'us-east-1'
    }
}
CELERY_DEFAULT_QUEUE = 'celery<-project-queue>'
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}
###
### End celery tasks
###
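A side note on why the credentials are percent-encoded before they go into BROKER_URL: AWS secret keys can contain characters such as / and +, which would otherwise break URL parsing. A tiny illustrative snippet (the key values here are made up):
from urllib.parse import quote
# a hypothetical secret containing '/' and '+'; without quoting, the '/' would be
# read as part of the URL path and the broker URL would not parse correctly
secret = 'abc/def+ghi'
print('sqs://%s:%s@' % (quote('AKIAEXAMPLE', safe=''), quote(secret, safe='')))
# -> sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@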
celery_tasks.py (referred to in the tutorial as celery.py; renamed here because that name apparently caused errors for some other programmers):
from __future__ import absolute_import
import os
from celery import Celery
# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<project>.settings')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# Settings are read straight from django.conf:settings; no namespace='CELERY'
# is passed here, so keys like BROKER_URL are used without a `CELERY_` prefix.
app = Celery('<project>', include=['<project>.tasks'])
app.config_from_object('django.conf:settings')
# Load task modules from all registered Django apps.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print("debug_task was fired!")
    print(f'Request: {self.request!r}')

if __name__ == '__main__':
    app.start()
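One thing the official Django/Celery tutorial pairs with this file is an __init__.py in the project package that imports the app, so it is loaded when Django starts. If you rename celery.py to celery_tasks.py as above, the import changes accordingly; a minimal sketch, assuming the same project layout:
# <project>/__init__.py
from __future__ import absolute_import
# Make sure the Celery app is loaded when Django starts so that
# @shared_task decorators use this app instance.
from .celery_tasks import app as celery_app
__all__ = ('celery_app',)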
tasks.py (this lives in the same directory as settings.py and is also referenced in celery_tasks.py via include=['<project>.tasks']):
from __future__ import absolute_import
from .celery_tasks import app
import time
@app.task(ignore_result=True)
def sleep(x, y):
    print("Sleeping for: " + str(x + y))
    time.sleep(x + y)
    print("Slept for: " + str(x + y))
When I went to run the worker (make sure you are in the same directory as manage.py), it threw an error on this import:
from kombu.async.timer import Entry, Timer as Schedule, to_timestamp, logger
To fix it, I ran, as per gogaz's answer:
pip uninstall django-celery
pip uninstall celery && pip install celery
which pushed me to the latest version of Celery, 4.3. Celery 4+ isn't officially supported on Windows, as per this SO question (Celery raises ValueError: not enough values to unpack), which conveniently has this answer (posted by Samuel Chen):
for celery 4.2+, python3, windows 10
pip install gevent
celery -A <project> worker -l info -P gevent
for celery 4.1+, python3, windows 10
pip install eventlet
celery -A <project> worker -l info -P eventlet
The only other message I get is a warning about Django's DEBUG setting being on, which apparently causes memory leaks...
The problem (for me at least) is that I can't use the Prefork pool, which means that I can't use app.control.revoke() to terminate tasks.
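For context, this is roughly what that revoke call looks like; terminating an already-running task needs a pool that supports it (prefork does, the gevent/eventlet pools do not), which is exactly the limitation above. The task and arguments here are just illustrative:
from <project>.celery_tasks import app
from <project>.tasks import sleep

result = sleep.delay(10, 10)
# revoke() alone only stops a task that hasn't started yet;
# terminate=True also kills a task that is already executing,
# but that is only honoured by pools such as prefork.
app.control.revoke(result.id, terminate=True)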
---EDIT---
Also worth mentioning that after this answer was posted, I switched to a Linux box. Unknown to me, due to a lack of experience, there are different pool types (execution modes) you can run background tasks in. I don't remember all the names, but if you search for "celery multithreading vs gevent", it will likely come back with the other modes you can run Celery in, their purposes, and which ones are supported on each platform. Windows couldn't run the mode that I thought made the most sense for my problem (I believe it was multithreading), and that was a real issue. Linux, however, can run all of them, so I switched back to Linux just for Celery. I had some problems with Django in a Red Hat environment, so I had to fix those issues as well :|
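For reference, the pool is chosen with the -P flag when starting the worker. A rough sketch of the common options (availability differs by platform and Celery version, so treat this as a guide rather than a definitive list):
celery -A <project> worker -P prefork   # default; multiprocessing, problematic on Windows with Celery 4+
celery -A <project> worker -P solo      # single process, handy for debugging
celery -A <project> worker -P gevent    # greenlets, suited to I/O-bound tasks (pip install gevent)
celery -A <project> worker -P eventlet  # greenlet-based alternative (pip install eventlet)
celery -A <project> worker -P threads   # thread pool, available in newer Celery releases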
I tested Celery on the same Python version you have and it is okay, and https://github.com/celery/kombu/blob/master/kombu/asynchronous/timer.py shows that renaming things randomly is not going to help you. Maybe you should try
pip uninstall kombu && pip --no-cache-dir install -U kombu
to perform a fresh install of kombu. I guess there must be something wrong with your installation, so if the kombu reinstall didn't work, try installing the whole thing again.
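After reinstalling, a quick sanity check (just an illustrative one-liner) is to import the module that was reported missing and print where kombu is loaded from:
python -c "from kombu.asynchronous.timer import Entry; import kombu; print(kombu.__version__, kombu.__file__)"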
I had this issue with the default Celery installation from pip (3.1.26.post2). As mentioned above, I installed version 3.1.25 instead, but Celery was still not working. Thus I explicitly installed the latest version:
pip install Celery==4.3
and everything is working now!
I landed here after I tried to install django-celery while reading the Celery 4.4 documentation. That package forces the Celery version down to 3.1.26.post2, so I had to:
pip uninstall django-celery
pip uninstall celery && pip install celery # Uninstall 3.1 and install latest
As the documentation clearly says:
Django is supported out of the box now so this document only contains a basic way to integrate Celery and Django. You’ll use the same API as non-Django users so you’re recommended to read the First Steps with Celery tutorial first and come back to this tutorial.
I had the same problem, but solved it by reinstalling Celery with version 3.1.25:
pip uninstall celery && pip install celery==3.1.25
Maybe because Windows is not officially supported by Celery 4: https://github.com/celery/celery/issues/3551
TL;DR: remove the kombu directory from the root of your virtualenv (if it exists). It may only fail on Windows.
It seems to be a quirk. I found the same error and I checked out what was happening.
The wheel package that pip downloads looks fine (kombu.asynchronous.timer exists in it). The release for the latest version (currently 4.2.0) is also fine. What was strange was what I found in my virtualenv installation.
I found a kombu directory at my virtualenv root which has the content of the library, but it also has an "async" directory alongside an "asynchronous" one. These directories aren't from the 4.2.0 release, as "async" has the timer.py file but "asynchronous" doesn't.
Where did it come from? Apparently from the wheel's data directory.
So, the solution: I removed the kombu directory from the root of my virtualenv and Celery worked.
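If you want to check whether you are in the same situation, one quick (illustrative) check is to print which kombu directory Python actually imports:
python -c "import kombu; print(kombu.__file__)"
# this should point inside .../site-packages/kombu/; if a second kombu directory
# sits directly at the virtualenv root, that stray copy is the one to delete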
I just started with Celery. I followed the instructions and installed Celery v4.2.0.
When I was trying to run the command:
celery -A mysite worker -l info
I got the error:
ModuleNotFoundError: No module named 'kombu.asynchronous.timer'
I removed the Celery installation: pip uninstall celery
Afterwards I installed Celery 3.1.25 as 'chuhy' recommended,
but it had some other issues, so I immediately uninstalled 3.1.25 and reinstalled Celery v4.2.0.
After this the error didn't pop up again.
I have faced a similar type of issue; it is because of an older version of Celery. Uninstall Celery (pip uninstall celery), install it again (pip install Celery==4.3), and kaboom, it will work.