I am using aiohttp to create an asyncio web server. However, to my understanding, asyncio means the server can only run on one processing core. Regular, synchronous servers like uwsgi, on the other hand, can fully utilize the machine's computing resources with truly parallel threads and processes. Why, then, is asyncio new and trendy if it is less parallel than multiprocessing? Can async servers like aiohttp be multi-processed?
Why, then, is asyncio new and trendy if it is less parallel than multiprocessing?
The two solve different problems. Asyncio allows writing asynchronous code without descending into "callback hell": await lets you use ordinary constructs like loops, ifs, and try/except, with automatic task switching at await points. This enables servicing a large number of connections without spawning a thread per connection, yet with maintainable code that reads as if it were written for blocking connections. Thus asyncio mostly helps with code whose only bottleneck is waiting for external events, such as network I/O and timeouts.
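To make the single-threaded concurrency concrete, here is a minimal sketch (the `fetch` coroutine is a hypothetical stand-in for a network request, simulated with `asyncio.sleep`):

```python
import asyncio

async def fetch(i):
    # Simulate waiting on network I/O; at this await point the event
    # loop suspends this task and runs other pending tasks instead.
    await asyncio.sleep(0.1)
    return i * 2

async def main():
    # 100 "connections" serviced concurrently by a single thread:
    # total wall time is ~0.1s, not 100 * 0.1s.
    return await asyncio.gather(*(fetch(i) for i in range(100)))

results = asyncio.run(main())
print(len(results))
```

Note that the body of `fetch` reads exactly like blocking code; the event loop does the task switching behind the scenes.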
Multiprocessing, on the other hand, is about parallelizing execution of CPU-bound code, such as scientific calculations. Since OS threads do not help due to the GIL, multiprocessing spawns separate OS processes and distributes the work among them. This comes at the cost of the processes not being able to easily share data - all communication is done either by serialization through pipes, or with dedicated proxies.
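A minimal multiprocessing sketch of that idea (the `cpu_bound` function is a hypothetical stand-in for a real calculation):

```python
from multiprocessing import Pool

def cpu_bound(n):
    # CPU-heavy work; with threads the GIL would serialize this,
    # but separate processes each have their own interpreter and GIL.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Arguments and results are serialized (pickled) through pipes,
        # which is the sharing cost mentioned above.
        results = pool.map(cpu_bound, [10_000] * 4)
    print(results)
```

The `if __name__ == "__main__"` guard is required on platforms that spawn rather than fork worker processes.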
A multi-threaded asyncio-style framework is possible in theory - for example, Rust's tokio is like that - but would be unlikely to bring performance benefits due to Python's GIL preventing utilization of multiple cores. Combining asyncio and multiprocessing can work for asyncio code that doesn't depend on shared state, which asyncio supports through run_in_executor and ProcessPoolExecutor.
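A minimal sketch of that combination, using `loop.run_in_executor` with a `ProcessPoolExecutor` (again with a hypothetical `cpu_bound` function):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # CPU-heavy work; runs in a separate process, so it neither blocks
    # the event loop nor contends with it for the GIL.
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # The coroutine awaits the result while the event loop stays
        # free to keep servicing other connections.
        result = await loop.run_in_executor(pool, cpu_bound, 1_000_000)
        print(result)

if __name__ == "__main__":
    asyncio.run(main())
```

Since Python 3.9 the same pattern can also be written with `asyncio.to_thread` for thread pools, but for CPU-bound work a process pool is what actually buys parallelism.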
Gunicorn can help you run multiple worker processes, each with its own event loop:
gunicorn module:app --bind 0.0.0.0:8080 --worker-class aiohttp.GunicornWebWorker --workers 4
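For that command to work, there must be a file whose name and application variable match the `module:app` argument - for example, a minimal `module.py` like this (the handler and route here are illustrative assumptions):

```python
# module.py -- minimal aiohttp application served by the gunicorn
# command above; `module:app` refers to this file and this variable.
from aiohttp import web

async def handle(request):
    return web.Response(text="hello")

app = web.Application()
app.add_routes([web.get("/", handle)])
```

Each of the four gunicorn workers is a separate OS process running this app, so you get multi-core utilization for request handling while keeping asyncio's concurrency within each worker.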