I need to run a process that might take hours to complete from a Django view. I don't need to know the state or communicate with it but I need that view to redirect away right after starting the process.
I've tried using subprocess.Popen, using it within a new threading.Thread, and multiprocessing.Process. However, the parent process keeps hanging until the child terminates. The only way that almost gets it done is using a fork. Obviously that isn't good, as it leaves a zombie process behind until the parent terminates.
That's what I'm trying to do when using fork:
    if os.fork() == 0:
        # child: launch the long-running script
        subprocess.Popen(["/usr/bin/python", script_path, "-v"])
    else:
        # parent: redirect immediately
        return HttpResponseRedirect(reverse('view_to_redirect'))
So, is there a way to run a completely independent process from a Django view with minimal casualties? Or am I doing something wrong?
Celery makes it easy to implement task queues for many workers in a Django application.
Celery is a task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well. The execution units, called tasks, are executed concurrently on one or more worker servers.
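A minimal sketch of that idea, assuming a configured Celery app and broker (the task name, view name, and script path below are illustrative, not from your code):

    # tasks.py
    import subprocess
    from celery import shared_task

    @shared_task
    def run_long_script(script_path):
        # runs in a Celery worker process, not in the web process
        subprocess.call(["/usr/bin/python", script_path, "-v"])

    # views.py
    from django.http import HttpResponseRedirect
    from django.urls import reverse
    from .tasks import run_long_script

    def start_job(request):
        run_long_script.delay("/path/to/long_script.py")  # returns immediately
        return HttpResponseRedirect(reverse('view_to_redirect'))

The view only enqueues the task and redirects; the hours-long work happens in the worker, completely outside the request/response cycle.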
Django Q is a native Django task queue, scheduler and worker application using Python multiprocessing.
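Roughly the same pattern with Django Q, assuming the package is installed and a qcluster is running (function and view names are hypothetical):

    # views.py
    import subprocess
    from django.http import HttpResponseRedirect
    from django.urls import reverse
    from django_q.tasks import async_task

    def run_script(script_path):
        # executed by the Django Q cluster, outside the request cycle
        subprocess.call(["/usr/bin/python", script_path, "-v"])

    def start_job(request):
        # queue the function and return right away
        async_task(run_script, "/path/to/long_script.py")
        return HttpResponseRedirect(reverse('view_to_redirect'))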
I don't know if this will be suitable for your case; nevertheless, here is what I do: I use a task queue (via a Django model). When the view is called, it inserts a new record into the task table and redirects happily. The tasks, in turn, are executed by cron on a regular basis, independently of Django. A rough sketch of the pattern is below.
Edit: cron calls the relevant (and custom) Django management command to execute the tasks.
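Something along these lines (the model, command, and app names are all hypothetical):

    # myapp/models.py
    from django.db import models

    class Task(models.Model):
        # minimal task record queued by the view
        script_path = models.CharField(max_length=255)
        done = models.BooleanField(default=False)
        created = models.DateTimeField(auto_now_add=True)

    # myapp/views.py
    from django.http import HttpResponseRedirect
    from django.urls import reverse
    from myapp.models import Task

    def start_job(request):
        Task.objects.create(script_path="/path/to/long_script.py")
        return HttpResponseRedirect(reverse('view_to_redirect'))

    # myapp/management/commands/run_tasks.py
    # cron runs: python manage.py run_tasks
    import subprocess
    from django.core.management.base import BaseCommand
    from myapp.models import Task

    class Command(BaseCommand):
        help = "Execute pending tasks queued by the web views"

        def handle(self, *args, **options):
            for task in Task.objects.filter(done=False):
                subprocess.call(["/usr/bin/python", task.script_path, "-v"])
                task.done = True
                task.save()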