I understand both asynchronous and multithreaded programming; I have done both and can do either with ease. However, one thing still bugs me: why is the general consensus that async performs better than multithreading? (Added: I'm talking about the case where either approach is viable and you get to make a choice.)
At first glance the reasons seem clear: fewer threads, less work for the OS scheduler, less memory wasted on stack space. But... I don't feel like these arguments hold water. Let's look at them individually:
Take the memory argument: instead of stack space you get Task objects allocated on the heap, essentially one for every stack frame, containing state, local variables, callback references and everything else. Plus that state is fragmented all over the address space, which gives the CPU cache even more headaches, because prefetching will be useless. So... which elephant in the room have I missed?
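To make the memory argument concrete, here is a minimal sketch of the kind of call chain the question has in mind (the file name and method names are illustrative, not from the original post): every async method that actually suspends at an await has its compiler-generated state machine captured on the heap and hands a Task back to its caller.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class HeapStateSketch
{
    // When the read is not already complete, this method's locals are captured
    // in a compiler-generated state machine on the heap and a Task is returned.
    static async Task<int> ReadLengthAsync(string path)
    {
        using var stream = File.OpenRead(path);
        var buffer = new byte[4096];
        return await stream.ReadAsync(buffer, 0, buffer.Length);
    }

    // This frame, too, becomes heap-allocated state while it awaits the inner Task.
    static async Task<int> CallerAsync(string path)
    {
        int length = await ReadLengthAsync(path);
        return length * 2;
    }

    static async Task Main()
    {
        Console.WriteLine(await CallerAsync("data.bin"));
    }
}
```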
Async methods don't require multithreading because an async method doesn't run on its own thread. The method runs on the current synchronization context and uses time on the thread only when the method is active. You can use Task.Run to move CPU-bound work onto a background thread.
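As a hedged illustration of that distinction (the HttpClient usage and method names here are just an example, not from the quoted text): awaiting I/O does not occupy a thread at all, while Task.Run explicitly borrows a thread-pool thread for CPU-bound work.

```csharp
using System.Net.Http;
using System.Threading.Tasks;

class OffloadSketch
{
    static readonly HttpClient Http = new HttpClient();

    // I/O-bound: no thread is dedicated to the request while it is in flight.
    static Task<string> DownloadAsync(string url) =>
        Http.GetStringAsync(url);

    // CPU-bound: Task.Run moves the loop onto a thread-pool thread so the
    // current (e.g. UI) thread stays responsive.
    static Task<long> SumAsync(int[] numbers) =>
        Task.Run(() =>
        {
            long sum = 0;
            foreach (var n in numbers) sum += n;
            return sum;
        });
}
```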
So why is asyncio faster than multithreading if they both belong to asynchronous programming? Because asyncio is more deliberate about task scheduling and gives the user full control over where code execution is suspended.
Asynchronous programming is about an asynchronous sequence of tasks, while multithreading is about multiple threads running in parallel. Multithreading is one way of achieving asynchrony, but we can also have single-threaded asynchronous tasks. The best way to see the difference is with an example, like the sketch below.
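In this minimal sketch (Task.Delay and a busy loop stand in for real I/O and real CPU work), the two delays overlap without any extra thread being used, while the two CPU-bound loops only overlap because they run on separate thread-pool threads.

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class AsyncVsThreadsDemo
{
    static async Task Main()
    {
        // Single-threaded asynchrony: both waits overlap because neither
        // occupies a thread while it is "running".
        var sw = Stopwatch.StartNew();
        await Task.WhenAll(Task.Delay(1000), Task.Delay(1000));
        Console.WriteLine($"Concurrent awaits: {sw.ElapsedMilliseconds} ms"); // ~1000, not 2000

        // Multithreading: CPU-bound work overlaps only by running on
        // separate thread-pool threads.
        sw.Restart();
        await Task.WhenAll(Task.Run(Spin), Task.Run(Spin));
        Console.WriteLine($"Parallel CPU work: {sw.ElapsedMilliseconds} ms"); // ~1000 with 2+ cores
    }

    static void Spin()
    {
        var end = DateTime.UtcNow.AddSeconds(1);
        while (DateTime.UtcNow < end) { /* burn CPU */ }
    }
}
```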
Asynchronous code can also be faster than synchronous code in terms of overall throughput. The code can easily be rewritten in an asynchronous way because SaveChanges has an asynchronous counterpart called SaveChangesAsync. Asynchronous programming allows us to unblock the thread after it has initiated an operation like a database call, so it can serve other requests.
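A sketch of that rewrite, assuming an Entity Framework Core DbContext (AppDbContext and Order are illustrative names, not taken from the text):

```csharp
using System.Threading.Tasks;

class OrderService
{
    // Synchronous: the request thread is blocked for the whole database round trip.
    public void PlaceOrder(AppDbContext db, Order order)
    {
        db.Orders.Add(order);
        db.SaveChanges();
    }

    // Asynchronous: the thread is released while the database call is in flight
    // and can serve other requests in the meantime.
    public async Task PlaceOrderAsync(AppDbContext db, Order order)
    {
        db.Orders.Add(order);
        await db.SaveChangesAsync();
    }
}
```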
why is the general consensus that async is better performing than multithreading? (added: I'm talking about the case where either approach is viable and you get to make a choice)
On the server side, async lets you make maximum use of threads. Why have one thread handle a single connection when it can handle hundreds? On the server side, it's not an "async vs threads" scenario - it's an "async and threads" scenario.
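For example (the HttpClient usage is illustrative, not from the answer), hundreds of outstanding I/O operations can be in flight at once without dedicating a thread to each of them; thread-pool threads are only borrowed briefly to run the continuations.

```csharp
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class ManyConnectionsSketch
{
    static readonly HttpClient Http = new HttpClient();

    static async Task<string[]> FetchAllAsync(string[] urls)
    {
        // All requests are started up front; no thread sits parked on any of them.
        var tasks = urls.Select(url => Http.GetStringAsync(url));
        return await Task.WhenAll(tasks);
    }
}
```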
On the client side - where either approach is truly viable - it doesn't matter as much. So what if you spin up an extra unnecessary thread? It's just not that big of a deal, even for mobile apps these days. While technically async can help you be more efficient, especially on a memory- and battery-constrained device, at this point in history it's not that terribly important. However, even on the client side, async has a tremendous benefit in that it allows you to write serial code rather than mucking around with callbacks.
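A sketch of the "serial code vs callbacks" point (HttpClient and the method names are illustrative): both versions do the same thing, but the awaited one reads top to bottom.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class SerialVsCallbacks
{
    static readonly HttpClient Http = new HttpClient();

    // Callback style: the continuation has to be passed in explicitly.
    static void GetLengthWithCallback(string url, Action<int> onDone)
    {
        Http.GetStringAsync(url)
            .ContinueWith(t => onDone(t.Result.Length));
    }

    // Await style: the same asynchrony, written as ordinary serial code.
    static async Task<int> GetLengthAsync(string url)
    {
        string body = await Http.GetStringAsync(url);
        return body.Length;
    }
}
```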
There are still N tasks running in parallel, SOMEBODY has to switch between them.
No. I/O tasks as used by async do not "run" anywhere, and do not need to be "switched" to. On Windows, I/O tasks use IOCPs underneath, and I/O tasks do not "run" - they only "complete", which happens as the result of a system interrupt. More info in my blog post "There Is No Thread".
Where then does the efficiency come from?
The word "efficiency" is tricky. For example, an asynchronous HTTP server handler will actually respond more slowly than a synchronous handler. There's overhead to setting up the whole async thing with callbacks, etc. However, that slowdown AFAICT is unmeasurably small, and asynchronous code allows that server to handle more simultaneous requests than a synchronous server ever could (in real-world tests, we're talking 10x as a conservative estimate). Furthermore, asynchronous code is not limited by the thread injection rate of the thread pool, so asynchronous server code responds faster to sudden changes in load, reducing the number of request timeouts as compared to a synchronous server in the same scenario. Again, this is due to "async and threads", not "async instead of threads".
A few years ago, Node.js was heralded as an incredibly efficient server - based on real-world measurements. At the time, most ASP.NET apps were synchronous (writing asynchronous apps was quite hard before async, and companies knew it was cheaper to just pay for more server hardware). Node.js, in fact, only has one server thread that ever runs your app. It was 100% asynchronous, and that's where it got its scalability benefits from. ASP.NET took note of this, and ASP.NET Core (among other changes) made its entire stack asynchronous.