
Using python's Multiprocessing makes response hang on gunicorn

First, I will admit that there are a few too many keywords in that title, but I am genuinely trying to capture the problem accurately. The issue is that I cannot seem to correctly create a subprocess using the Python multiprocessing module without causing the web page response to hang. I have tried a few recent versions of gunicorn and the problem persists. Interestingly, this was never an issue on Ubuntu Server, but after moving the application to RHEL 6.5 the issue has presented itself. Here is the workflow:

- the route is hit
- a form is submitted, which hits the route and triggers a multiprocessing.Process() to be created, where the work done is to sleep for 30 seconds
- the route appears to finish, as a print statement after the multiprocessing call is printed; however, the browser keeps the connection open and does not 'finish loading' (show the page) until the 30 seconds of sleep have finished

Note that the form submission is not part of this issue; it just makes the problem easy to observe.

Here is a very simple route and function that produces the issue:

import time
import multiprocessing

from flask import redirect, render_template

# 'app' and 'forms' are defined elsewhere in the application

def multi():
    print 'begin multi'
    time.sleep(30)
    print 'end multi'

@app.route('/multiprocesstest', methods=['GET','POST'])
def multiprocesstest():

    syntaxForm = forms.mainSyntaxForm()

    if syntaxForm.validate_on_submit():
        print 'before multi call'
        th = multiprocessing.Process(target=multi)
        th.start()
        print 'after multi call'
        return redirect('multiprocesstest')

    return render_template('syntax_main.html', form=syntaxForm)

After extended research and sparse Google results for this problem, I have not found anything conclusive. I am going to try another WSGI server to check whether the problem is specific to gunicorn.

Rboreal_Frippery asked Mar 20 '15


1 Answer

Replacing multiprocessing with multiprocessing.dummy might solve the issue. Gunicorn already manages its own worker processes, and forking additional processes from inside one of those workers can cause trouble; multiprocessing.dummy exposes the same API as multiprocessing but is backed by threads, so no child process is created inside the worker.
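A minimal sketch of the suggested swap, reusing the names from the question and keeping its Python 2 style; the only change is the import, since multiprocessing.dummy.Process has the same interface as multiprocessing.Process:

import time
import multiprocessing.dummy as multiprocessing  # same API, backed by threads

def multi():
    print 'begin multi'
    time.sleep(30)
    print 'end multi'

# inside the view, the call is unchanged; Process here wraps a thread,
# so nothing is forked out of the gunicorn worker
th = multiprocessing.Process(target=multi)
th.start()
print 'after multi call'  # should return immediately, without holding the response open

Note that threads are fine for a job that mostly sleeps or waits on I/O like this one; CPU-bound work would still be limited by the GIL.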

何俊烽 answered Sep 25 '22