 

unable to execute Celery beat the second time

I am using Celery beat to fetch site data every 10 seconds, so I updated the settings in my Django project. I am using RabbitMQ as the broker with Celery.

settings.py

# This is the settings file
from datetime import timedelta

# RabbitMQ configuration
BROKER_URL = "amqp://abcd:abcd@localhost:5672/abcd"

# Celery configuration
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'


CELERYBEAT_SCHEDULE = {
    # Executes every 10 seconds
    'update-app-data': {
        'task': 'myapp.tasks.fetch_data_task',
        'schedule': timedelta(seconds=10),
    },
}

celery.py

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# Tell Celery to use the default Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myapp')
app.config_from_object('django.conf:settings')
# This line will tell Celery to autodiscover all your tasks.py that are in
# playstore folders
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

app_keywords = Celery('keywords')
app_keywords.config_from_object('django.conf:settings')
# This line will tell Celery to autodiscover all your tasks.py that are in
# keywords folders
app_keywords.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


app1 = Celery('myapp1')
app1.config_from_object('django.conf:settings')
# This line will tell Celery to autodiscover all your tasks.py that are in
# your app folders
app1.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

tasks.py

import json
import logging

import requests
from celery import task
from django.conf import settings

logger = logging.getLogger(__name__)


@task(bind=True)
def fetch_data_task(self, data):
    logger.info("Start task")
    import pdb; pdb.set_trace()
    # post the data to the view
    headers, cookies = utils.get_csrf_token()
    requests.post(settings.SITE_VARIABLES['site_url'] + "/site/general_data/",
                  data=json.dumps(data), headers=headers, cookies=cookies
                  )
    if data['reviews']:
        reviews_data = {'app_id': data['app_data']['app_id'],
                        'reviews': data['reviews'][0]}
        requests.post(settings.SITE_VARIABLES['site_url'] + "/site/blog/reviews/",
                      data=json.dumps(reviews_data), headers=headers, cookies=cookies
                      )
    logger.info("Task fetch data finished")

Now once I call fetch_data_task in my API after logging in to the site, the task is queued in RabbitMQ, and then it should call the function along with the arguments.

Here is the line where I am calling the task for the very first time:

tasks.fetch_data_task.apply_async((data,))

This queues the task, and the task executes each time, but it gives me the following error:

[2016-09-13 18:57:43,044: ERROR/MainProcess] Task playstore.tasks.fetch_data_task[3b88c6d0-48db-49c1-b7d1-0b8469775d53] raised unexpected: TypeError("fetch_data_task() missing 1 required positional argument: 'data'",)

Traceback (most recent call last):
  File "/Users/chitrankdixit/.virtualenvs/hashgrowth-dev/lib/python3.5/site-packages/celery/app/trace.py", line 240, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/Users/chitrankdixit/.virtualenvs/hashgrowth-dev/lib/python3.5/site-packages/celery/app/trace.py", line 438, in __protected_call__
    return self.run(*args, **kwargs)
TypeError: fetch_data_task() missing 1 required positional argument: 'data'

If anyone has worked with Celery and RabbitMQ, and with periodic tasks using Celery, please suggest how to execute the tasks properly.

asked Nov 08 '22 by Chitrank Dixit


1 Answer

The exception tells you what the error is: your task expects a positional argument, but you do not provide any arguments in your schedule definition.

CELERYBEAT_SCHEDULE = {
    # Executes every 10 seconds
    'update-app-data': {
        'task': 'myapp.tasks.fetch_data_task',
        'schedule': timedelta(seconds=10),
        'args': ({
            # whatever goes into 'data'
        },),  # tuple with one entry, don't omit the comma
    },
}

Calling the task from any other place in your code does not have any effect on the schedule.
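For a one-off run, by contrast, the arguments have to be passed explicitly in the call itself. A minimal sketch, assuming the import path from your project and a placeholder payload standing in for whatever your data dict actually contains:

from myapp.tasks import fetch_data_task

# Placeholder payload; the real 'data' dict comes from your own code.
data = {
    'app_data': {'app_id': 'com.example.app'},
    'reviews': [],
}

# An explicit call carries its own arguments; beat only uses the
# 'args'/'kwargs' entries defined in CELERYBEAT_SCHEDULE.
fetch_data_task.apply_async(args=(data,))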

answered Nov 14 '22 by Daniel Hepper