
Processing Simultaneous/Asynchronous Requests with Python BaseHTTPServer

I've set up a threaded (with Python threads) HTTP server by creating a class that inherits from HTTPServer and ThreadingMixIn:

from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler
from SocketServer import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass

I have a handler class which inherits from BaseHTTPRequestHandler, and I start the server with something like this:

class MyHandler(BaseHTTPRequestHandler):
    ...

server = ThreadedHTTPServer(('localhost', 8080), MyHandler)
# Prevent issues with socket reuse
server.allow_reuse_address = True
# Start the server
server.serve_forever()

This is all pretty straightforward. The problem I'm encountering is that, whether I use ThreadingMixIn, ForkingMixIn, or neither, each request ends up blocking until the previous request handler returns. This can easily be seen with this example handler:

import time

class MyHandler(BaseHTTPRequestHandler):
    def respond(self, status_code):
        self.send_response(status_code)
        self.end_headers()

    def do_GET(self):
        print "Entered GET request handler"
        time.sleep(10)
        print "Sending response!"
        self.respond(200)

If the server were processing these simultaneously, then we would be able to send two requests and see the server enter both GET request handlers before sending either response. Instead, the server will enter the GET request handler for the first request, wait for it to return, then enter it for the second (so the second request takes ~20 seconds to return instead of 10).

Is there a straightforward way for me to implement a system where the server doesn't wait on the handler to return? Specifically, I'm trying to write a system that waits to receive several requests before responding to any of them (a form of long polling), and I'm running into issues where the first waiting request blocks any future requests from connecting to the server.

asked Sep 29 '12 by Dylnuge

1 Answer

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass

is enough. Your client probably doesn't make concurrent requests. If you issue the requests in parallel, the threaded server works as expected. Here's a test client:

#!/usr/bin/env python
import sys
import urllib2

from threading import Thread

def make_request(url):
    print urllib2.urlopen(url).read()

def main():
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
    for _ in range(10):
        Thread(target=make_request, args=("http://localhost:%d" % port,)).start()

if __name__ == '__main__':
    main()

And the corresponding server:

import time
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer, test as _test
from SocketServer import ThreadingMixIn


class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()

        self.wfile.write("Entered GET request handler")
        time.sleep(1)
        self.wfile.write("Sending response!")

def test(HandlerClass=SlowHandler, ServerClass=ThreadedHTTPServer):
    _test(HandlerClass, ServerClass)


if __name__ == '__main__':
    test()

All 10 requests finish in about 1 second. If you remove ThreadingMixIn from the server definition, the 10 requests take about 10 seconds in total, because they are handled one at a time.
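On Python 3 the modules were renamed (BaseHTTPServer and SocketServer became http.server and socketserver), and since 3.7 the standard library ships a ready-made http.server.ThreadingHTTPServer. Here is a minimal, self-contained sketch of the same timing experiment on Python 3 (binding to port 0 asks the OS for a free port; the handler and thread counts are illustrative):

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(1)  # simulate slow work in the handler
        body = b"done"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 lets the OS choose a free port; serve in a background thread.
server = ThreadingHTTPServer(("localhost", 0), SlowHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def fetch():
    urllib.request.urlopen("http://localhost:%d" % port).read()

# Five concurrent requests: threaded server handles them in parallel.
start = time.time()
threads = [threading.Thread(target=fetch) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print("5 requests in %.1fs" % elapsed)  # roughly 1s when threaded, ~5s sequentially
server.shutdown()
```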

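For the long-polling goal in the question (hold several requests and release them all at once), each handler can simply block on a shared synchronization primitive, since ThreadingMixIn runs every request in its own thread and blocked handlers don't stop new connections. A minimal sketch using threading.Barrier and Python 3 module names; BatchHandler and N are illustrative names, not from the original post:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

N = 3  # respond only once 3 requests are pending
barrier = threading.Barrier(N)

class BatchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        barrier.wait()  # block this handler thread until N requests have arrived
        body = b"released together"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = ThreadingHTTPServer(("localhost", 0), BatchHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

results = []
def fetch():
    results.append(urllib.request.urlopen("http://localhost:%d" % port).read())

# No single request completes until all N are in flight.
threads = [threading.Thread(target=fetch) for _ in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
server.shutdown()
```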
answered Oct 09 '22 by jfs