This question has probably been asked, and more than likely answered, but I don't know where to find it here.
Problem: I have a route in Python's Flask that takes a while to process the data for each call. I need each call to that route to run in its own thread so that one slow request doesn't block the others.
Modern Python web frameworks like Flask, Django, and Tornado can all serve multiple requests simultaneously. "Multitasking" is a loose term here, because it can be achieved in several different ways: with multiprocessing, multithreading, or asyncio.
How many concurrent requests can Flask handle? Flask processes one request per thread at a time, so the concurrency comes from whatever is running it. If you have 2 worker processes with 4 threads each, that's up to 8 concurrent requests. Flask itself doesn't spawn or manage those threads or processes.
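As a rough sketch of the kind of slow route in question (the route name and sleep time are hypothetical stand-ins for the real processing), note that how many of these can run at once depends entirely on the server configuration, not on the view code:

    # hypothetical slow Flask view; 2 processes x 4 threads in front of it
    # would allow up to 8 of these to be in flight at the same time
    import time
    from flask import Flask

    app = Flask(__name__)

    @app.route("/slow")
    def slow():
        time.sleep(5)          # stand-in for the real, long-running processing
        return "done"

    if __name__ == "__main__":
        # the development server handles one request at a time unless told otherwise;
        # threaded=True lets it spawn a thread per request (fine for local testing only)
        app.run(threaded=True)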
Python's Global Interpreter Lock (GIL) only allows one thread to execute Python bytecode at a time, which means you can't get a real performance benefit from multithreading for CPU-bound work that runs inside the interpreter. This is what gives multiprocessing an upper hand over threading in Python.
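To see what that means in practice, here is a small, self-contained sketch (the busy loop and iteration count are arbitrary): on CPython, two CPU-bound threads take roughly as long as doing the same work sequentially.

    # rough illustration of the GIL: two CPU-bound threads take about as long
    # as running the same work one after the other (numbers are illustrative)
    import time
    from threading import Thread

    def busy(n=10_000_000):
        total = 0
        for i in range(n):
            total += i

    start = time.perf_counter()
    busy()
    busy()
    print("sequential:", time.perf_counter() - start)

    start = time.perf_counter()
    threads = [Thread(target=busy) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("2 threads: ", time.perf_counter() - start)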
multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads.
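A minimal sketch of that API similarity, reusing the same hypothetical busy() function from above: swapping threading.Thread for multiprocessing.Process is often the only change needed, and because each process has its own interpreter and its own GIL, the CPU-bound work can actually run in parallel.

    # same structure as the threading version, but with real parallelism;
    # the __main__ guard is needed so child processes can import this module safely
    import time
    from multiprocessing import Process

    def busy(n=10_000_000):
        total = 0
        for i in range(n):
            total += i

    if __name__ == "__main__":
        start = time.perf_counter()
        procs = [Process(target=busy) for _ in range(2)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print("2 processes:", time.perf_counter() - start)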
Flask comes with a built-in development web server, but you shouldn't use it in production. To get features like separate worker processes for requests and static file serving, you need to run an actual web server and a WSGI server in front of your Flask application.
The Flask docs provide several examples of how to set that up. Popular web server/WSGI combinations are Apache/mod_wsgi and Nginx/Gunicorn, but there are many other options.
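As a concrete sketch of the Nginx/Gunicorn option (the module and file names here are assumptions, not something from the question), the WSGI entry point is just the Flask application object, and the process and thread counts are set on the Gunicorn command line:

    # wsgi.py -- hypothetical entry point for a WSGI server such as Gunicorn
    # run with, for example:  gunicorn --workers 2 --threads 4 wsgi:app
    # (2 processes x 4 threads = up to 8 requests handled concurrently)
    from myflaskapp import app   # assumed module name; import your actual Flask app

    if __name__ == "__main__":
        # fall back to the development server when run directly
        app.run()

Nginx then sits in front of Gunicorn as a reverse proxy and serves static files itself, so the Python workers only see the slow, dynamic requests.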