I have a function in my Django views.py that looks like this.
def process(request):
    form = ProcessForm(request.POST, request.FILES)
    if form.is_valid():
        instance = form.save(commit=False)
        instance.requested_by = request.user
        instance.save()
        t = threading.Thread(target=utils.background_match, args=(instance,), kwargs={})
        t.setDaemon(True)
        t.start()
        return HttpResponseRedirect(reverse('mart:processing'))
Here, I'm trying to call a function 'background_match' in a separate thread when ProcessForm is submitted. Since this thread takes some time to complete, I redirect the user to another page named 'mart:processing'.
The problem I am facing is that it all works fine on my local machine but doesn't work on the production server, which is an AWS EC2 instance. The thread doesn't start at all: there's a for loop inside the background_match function that doesn't move forward.
However, if I refresh (CTRL + R) the 'mart:processing' page, it does move by 1 or 2 iterations. So, for a complete loop of 1000 iterations to run, I would need to refresh the page 1000 times. If, after say 100 iterations, I don't refresh the page, it gets stuck at that point and never reaches the 101st iteration. Please help!
Wrong architecture. Django and other web apps should not be spawning threads like this. The correct way is to create an async task using a task queue. The most popular task queue for Django is Celery.
The mart:processing page should then check the async result to determine if the task has been completed. A rough sketch is as follows.
from celery.result import AsyncResult
from myapp.tasks import my_task
...
if form.is_valid():
    ...
    result = my_task.delay(instance.pk)   # queue the task in the worker; don't call it in-process
    request.session['task_id'] = result.id
    return HttpResponseRedirect(reverse('mart:processing'))
...
On the subsequent page
task_id = request.session.get('task_id')
if task_id:
    task = AsyncResult(task_id)
    if task.ready():
        # the background work has finished; render or redirect to the result
        ...
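The sketch imports my_task but never defines it. A minimal, hypothetical myapp/tasks.py, assuming the existing utils.background_match does the actual work and that the task receives the instance's primary key rather than the object itself (Celery arguments have to be serializable), could look like this:

# myapp/tasks.py -- hypothetical sketch; adjust the model and import paths to your project
from celery import shared_task

from myapp import utils
from myapp.models import ProcessRequest   # assumed name of the model that ProcessForm saves

@shared_task
def my_task(instance_pk):
    # Re-fetch the instance inside the worker process.
    instance = ProcessRequest.objects.get(pk=instance_pk)
    utils.background_match(instance)

Keep in mind this only works if a Celery worker is actually running on the EC2 instance (for example celery -A yourproject worker --loglevel=info) together with a broker such as Redis or RabbitMQ; the worker then executes background_match outside the request/response cycle, so it no longer depends on page refreshes.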