 

How to have django give an HTTP response before continuing on to complete a task associated with the request?

Tags:

yield

django

In my django piston API, I want to yield/return an HTTP response to the client before calling another function that will take quite some time. How do I make the yield give an HTTP response containing the desired JSON and not a string relating to the creation of a generator object?

My piston handler method looks like so:

def create(self, request):
    data = request.data

    # ...other operations...

    incident.save()
    response = rc.CREATED
    response.content = {"id":str(incident.id)}
    yield response
    manage_incident(incident)

Instead of the response I want, like:

   {"id":"13"}

The client gets a string like this:

 "<generator object create at 0x102c50050>"

EDIT:

I realise that using yield was the wrong way to go about this. In essence, what I am trying to achieve is that the client receives a response right away, before the server moves on to the time-costly manage_incident() function.
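
(Why the string shows up: a function whose body contains yield is a generator function, so calling it only builds a generator object without running any of the body, and piston apparently serialises that object's repr. A minimal, piston-free illustration:)

def create():
    print("running the body")   # not executed by the bare call below
    yield {"id": "13"}

gen = create()      # builds a generator object; nothing in the body runs yet
print(gen)          # something like: <generator object create at 0x...>
print(next(gen))    # only now does the body run: prints the message, then {'id': '13'}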

asked Jul 07 '11 by Dangermouse

2 Answers

This doesn't have anything to do with generators or yielding, but I've used the following code and decorator to have things run in the background while returning the client an HTTP response immediately.

Usage:

@postpone
def long_process():
    # do things...
    pass

def some_view(request):
    long_process()            # returns at once; the work is queued for the background thread
    return HttpResponse(...)

And here's the code to make it work:

import atexit
import Queue
import threading

from django.core.mail import mail_admins


def _worker():
    # runs on the background thread: pull queued calls off and execute them one at a time
    while True:
        func, args, kwargs = _queue.get()
        try:
            func(*args, **kwargs)
        except:
            # report failures to the admins instead of killing the worker thread
            import traceback
            details = traceback.format_exc()
            mail_admins('Background process exception', details)
        finally:
            _queue.task_done()  # so we can join at exit

def postpone(func):
    # decorator: calling the wrapped function only queues it for the worker thread
    def decorator(*args, **kwargs):
        _queue.put((func, args, kwargs))
    return decorator

_queue = Queue.Queue()
_thread = threading.Thread(target=_worker)
_thread.daemon = True   # don't prevent interpreter shutdown
_thread.start()

def _cleanup():
    _queue.join()   # so we don't exit too soon

atexit.register(_cleanup)
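
Applied to the handler from the question, usage would look roughly like this (a sketch only: rc, incident and manage_incident come from the question; the piston import is assumed):

from piston.utils import rc   # assumed piston import for the rc shortcuts

@postpone
def manage_incident(incident):
    # ...the slow work, now run on the worker thread after the response is sent...
    pass

def create(self, request):
    data = request.data
    # ...other operations...
    incident.save()
    response = rc.CREATED
    response.content = {"id": str(incident.id)}
    manage_incident(incident)   # returns immediately; the call is just queued
    return response             # return, not yield

The view returns as soon as the response object is built; the queued call runs afterwards on the worker thread.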
answered by Dan Breen

Perhaps you could do something like this (be careful though):

import threading
def create(self, request):
    data = request.data 
    # do stuff...
    t = threading.Thread(target=manage_incident,
                         args=(incident,))
    t.setDaemon(True)   # allow the process to exit even if the thread is still running
    t.start()
    return response

Has anyone tried this? Is it safe? My guess is that it's not, mostly because of concurrency issues, but also because if you get a lot of requests you might end up with a lot of threads running at once (since each one might run for a while). Still, it might be worth a shot.

Otherwise, you could just add the incident that needs to be managed to your database and handle it later via a cron job or something like that.
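
A rough sketch of that cron variant, assuming a hypothetical managed boolean field on the incident model and a custom management command (none of these names are from the question):

# myapp/management/commands/process_incidents.py  (hypothetical path)
from django.core.management.base import BaseCommand

from myapp.models import Incident          # hypothetical model
from myapp.views import manage_incident    # wherever the slow function lives

class Command(BaseCommand):
    help = "Run manage_incident() for incidents the API has not processed yet"

    def handle(self, *args, **options):
        for incident in Incident.objects.filter(managed=False):
            manage_incident(incident)   # the time-costly work
            incident.managed = True
            incident.save()

The API view then only saves the incident and returns, and a cron entry such as */5 * * * * python /path/to/manage.py process_incidents picks the work up later.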

I don't think Django is built for either concurrency or very time-consuming operations.

Edit

Someone has tried it, and it seems to work.

Edit 2

These kinds of things are often better handled by background jobs. The Django Background Tasks library is nice, but there are others, of course.
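
For illustration, with django-background-tasks (if I remember its API correctly) the slow call could be queued roughly like this; the worker is started separately with manage.py process_tasks, and task arguments have to be JSON-serialisable, hence passing the id rather than the object:

from background_task import background

@background(schedule=0)            # picked up by the process_tasks worker
def manage_incident_later(incident_id):
    incident = Incident.objects.get(pk=incident_id)   # Incident model as in the question
    manage_incident(incident)                         # the slow work from the question

def create(self, request):
    # ... save the incident and build the response as in the question ...
    manage_incident_later(str(incident.id))   # only writes a task row, returns at once
    return response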

answered by André Laszlo