
Executing Python Script as Celery Task

I am trying to run a Python script as a Celery task with Django. The issue I am having is that the task reports itself as complete as soon as the script begins running. I initially used subprocess.Popen() in the tasks.py file, but realized the task would be marked complete as soon as the Popen() call was issued. I modified my tasks.py code to call a function in my Python script, which runs the script; however, the task still behaves as though it completes immediately. I am confused because Flower says the task is complete, yet the Celery log is still outputting the log data defined in the script I am running. I found the following related post, and I believe I am following its suggestion to execute a Python function from tasks.py.
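For reference, Popen() returns as soon as the child process is spawned; the caller has to call wait() or communicate() to block until the script finishes. A minimal sketch of that behavior (the script path is a placeholder):

import subprocess

p = subprocess.Popen(['python', '/path/to/run.py'])  # returns immediately; script keeps running
p.wait()  # blocks the caller until the script exits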

tasks.py:

import sys

from celery import shared_task
from celery.task.control import revoke  # Celery 3-style import; newer versions use app.control.revoke


@shared_task
def exe(workDir, cancelRun):
    sys.path.append(workDir)
    import run  # imported here because workDir is only added to sys.path at runtime

    if cancelRun == 'True':  # cancelRun arrives as a string
        task_id = exe.request.id
        revoke(task_id, terminate=True)
    else:
        run.runModel(workDir)
        task_id = exe.request.id
        return task_id

runModel function code:

import os
from multiprocessing import Process


def runModel(scendir):
    # myMain is defined elsewhere in this module
    fullpath = scendir + '/run.py'
    os.chdir(scendir)
    p = Process(target=myMain, args=(scendir,))
    p.start()
    p.join()  # blocks until the child process exits
asked Nov 11 '22 by Jason Hawkins

1 Answer

What happens if you run the command (the one executed by your task) in your terminal? If it returns immediately, it is possible that the script is daemonizing itself, for example via fork().
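If the script does fork and detach, no amount of waiting in the task will help; but if it simply needs to be waited on, a blocking call keeps the task running until the script exits. A minimal sketch of such a task, assuming a hypothetical run.py inside the working directory (names are illustrative, not from the question):

import subprocess

from celery import shared_task


@shared_task
def run_script(work_dir):
    # subprocess.run blocks until the child process exits, so the task
    # stays active for the script's whole lifetime. If the script forks
    # and its parent exits, this will still return early.
    result = subprocess.run(
        ['python', 'run.py'],   # hypothetical command line
        cwd=work_dir,
        capture_output=True,
        text=True,
    )
    result.check_returncode()   # raise CalledProcessError on failure
    return result.stdout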

Another way to test the cause is via the Django shell: run python manage.py shell, then execute the necessary imports and call your functions manually.
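For instance (the module path and arguments below are assumptions, not taken from the question):

python manage.py shell

>>> from myapp.tasks import exe                       # hypothetical module path
>>> exe('/path/to/scenario', 'False')                 # runs synchronously, no worker needed
>>> result = exe.delay('/path/to/scenario', 'False')  # or queue it through a worker
>>> result.status                                     # e.g. 'PENDING', 'STARTED', 'SUCCESS'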

answered Dec 26 '22 by jweyrich