I am trying to run a Python script as a Celery task with Django. The issue I am having is that the task thinks it is complete as soon as the script begins running. I initially used subprocess.Popen() in the tasks.py file, but realized this would mean the task would be reported complete as soon as the Popen() call was issued. I modified my tasks.py code to call a function in my Python script that runs the script; however, the task still behaves as though it is immediately complete. What confuses me is that flower reports the task as complete, yet the Celery log keeps outputting the log data produced by the script I am running. I found the following related post, and I believe I am following its suggestion to execute a Python function from tasks.py.
tasks.py:
import sys

from celery import shared_task
from celery.task.control import revoke  # Celery 3.x; Celery 4+ uses app.control.revoke

@shared_task
def exe(workDir, cancelRun):
    sys.path.append(workDir)
    import run  # the model script living in workDir
    if cancelRun == 'True':
        task_id = exe.request.id
        revoke(task_id, terminate=True)
    else:
        run.runModel(workDir)
        task_id = exe.request.id
        return task_id
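For reference, a task like this would be queued with something like the following; the workDir path is a placeholder:

result = exe.delay('/path/to/workDir', 'False')  # hypothetical workDir
print(result.id, result.state)  # state as recorded in the result backend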
runModel function code:
import os
from multiprocessing import Process

def runModel(scendir):
    fullpath = scendir + '/run.py'  # path to the script (not used below)
    os.chdir(scendir)
    p = Process(target=myMain, args=(scendir,))  # myMain is defined elsewhere in run.py
    p.start()
    p.join()  # blocks until myMain finishes
What happens if you run the command (the one executed by your task) on your terminal? If it returns immediately, it is possible that it is daemonizing itself, say via fork().
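For illustration, this is the kind of double-fork daemonization that makes a command return to its caller right away; a generic sketch, not taken from the asker's script:

import os
import sys

def daemonize():
    if os.fork() > 0:
        sys.exit(0)   # parent exits, so the caller sees the command finish at once
    os.setsid()       # detach from the controlling terminal
    if os.fork() > 0:
        sys.exit(0)   # first child exits; the grandchild keeps running in the background
    # ... the real work continues here, orphaned from the original process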
Another way to test the cause is to use the Django shell: run python manage.py shell, then do the necessary imports and call your functions manually.
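Concretely, inside the shell that would look something like this; the workDir path is hypothetical:

>>> import sys
>>> sys.path.append('/path/to/workDir')
>>> import run
>>> run.runModel('/path/to/workDir')

If this call also returns before the model finishes, the early exit is happening inside run.py itself rather than in Celery.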