I've created Celery tasks to run various jobs that were written in JavaScript for Node.js. Each task is basically a subprocess.Popen call that invokes node.
When a Node.js job fails, it exits with a non-zero status and writes error information to stderr.
When this happens, I want to return that stderr output as the "result" to Celery, along with a FAILURE
status, so that my jobs monitor can reflect that the job failed.
How can I do this?
This is my task:
import subprocess

@app.task
def badcommand():
    try:
        output = subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
        return output
    except subprocess.CalledProcessError as er:
        # What do I do here to return er.output and set the status to FAILURE?
If I don't catch the subprocess exception, the job properly fails, but the result is empty and I get a traceback instead.
If I catch the exception and return er.output, the job completes as a success.
You can use the celery.app.task.Task.update_state method to update the current task's state.
@app.task(bind=True)
def badcommand(self):
    try:
        output = subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
        return output
    except subprocess.CalledProcessError as er:
        # Mark the task as failed and store the exception in the result meta.
        self.update_state(state='FAILURE', meta={'exc': er})
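One caveat worth noting: in my experience, if the task then returns normally, the worker records a SUCCESS state that overwrites the FAILURE state you set, and the meta dict also has to be serializable by your result backend. Here is a minimal sketch of one way to handle both points, assuming Celery 3.1+ and that storing the captured output as text is acceptable (the proj.celery import path is hypothetical):

import subprocess

from celery.exceptions import Ignore

from proj.celery import app  # hypothetical module that defines your Celery app


@app.task(bind=True)
def badcommand(self):
    try:
        return subprocess.check_output('ls foobar', stderr=subprocess.STDOUT, shell=True)
    except subprocess.CalledProcessError as er:
        # Store the captured output as text rather than the exception object
        # so the meta serializes cleanly with the result backend.
        self.update_state(state='FAILURE', meta={'exc': er.output.decode(errors='replace')})
        # Raising Ignore() stops the worker from overwriting the custom
        # FAILURE state with SUCCESS when the task returns.
        raise Ignore()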
Note that the bind argument of the app.task decorator was introduced in Celery 3.1. If you're still using an older version, I think you can call the update_state task method this way:
@app.task
def badcommand():
    ...
    except subprocess.CalledProcessError as er:
        badcommand.update_state(state='FAILURE', meta={'exc': er})
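On the monitoring side, the state and meta you stored can then be read back from the task's AsyncResult. A sketch, assuming a result backend is configured and that proj.tasks is where the task lives (both are assumptions):

from proj.tasks import badcommand  # hypothetical import path for the task above

res = badcommand.delay()

# ... later, in the jobs monitor:
if res.state == 'FAILURE':
    # .info holds the meta dict passed to update_state, i.e. the captured stderr.
    print(res.info)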