I'm writing a small web server for testing purposes using Python, BaseHTTPServer and SimpleHTTPServer. It looks like it's processing one request at a time. Is there any way to make it a little faster without messing around too deeply? Basically my code looks like the following and I'd like to keep it this simple ;)
import os
import BaseHTTPServer
import SimpleHTTPServer

os.chdir(webroot)
httpd = BaseHTTPServer.HTTPServer(("", port), SimpleHTTPServer.SimpleHTTPRequestHandler)
print("Serving directory %s on port %i" % (webroot, port))
try:
    httpd.serve_forever()
except KeyboardInterrupt:
    print("Server stopped.")
You can make your own threading or forking server class with a mix-in from SocketServer:
import SocketServer
import BaseHTTPServer
class ThreadingHTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
pass
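With that class defined, the rest of your script can stay as it is; just construct the server from ThreadingHTTPServer instead of BaseHTTPServer.HTTPServer. A minimal sketch, reusing the webroot and port variables (and the os/SimpleHTTPServer imports) from your snippet:

os.chdir(webroot)
httpd = ThreadingHTTPServer(("", port), SimpleHTTPServer.SimpleHTTPRequestHandler)
# Each incoming request is now dispatched to its own thread
httpd.serve_forever()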
This has its limits as it doesn't use a thread pool, is limited by the GIL, etc., but it could help a little (with relatively little effort). Remember that requests will be served simultaneously by multiple threads, so be sure to put proper locking around accesses to global/shared data (unless such data is immutable after startup) done in the course of serving a request.
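For instance, if a handler updates some shared state, guard it with a threading.Lock. A minimal sketch; the hit_counts dict and the CountingRequestHandler subclass are hypothetical additions, not part of SimpleHTTPServer:

import threading
import SimpleHTTPServer

# Hypothetical shared state touched by every request
hit_counts = {}
hit_counts_lock = threading.Lock()

class CountingRequestHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        # Multiple handler threads may run at once, so protect the dict
        with hit_counts_lock:
            hit_counts[self.path] = hit_counts.get(self.path, 0) + 1
        SimpleHTTPServer.SimpleHTTPRequestHandler.do_GET(self)

You would then pass CountingRequestHandler instead of SimpleHTTPRequestHandler when constructing the server.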
This SO question covers the same ground, though not at great length.
You might also look at CherryPy -- it's pretty simple, too, and has multiple request threads with no additional effort. Although your needs may be modest now, CP has a lot of nice capabilities that may benefit you in the future.
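As a rough idea of what that looks like, here is a minimal CherryPy sketch that serves static files from your webroot on your port, assuming CherryPy is installed and those two variables are defined as in your script. tools.staticdir is CherryPy's built-in static-file tool and it wants an absolute directory path:

import os
import cherrypy

class Root(object):
    pass

# Listen on the same port as before; CherryPy uses a threaded worker pool by default
cherrypy.config.update({'server.socket_port': port})
cherrypy.quickstart(Root(), '/', {
    '/': {
        'tools.staticdir.on': True,
        'tools.staticdir.dir': os.path.abspath(webroot),
    }
})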