I am looking for an analogy that will help me understand the difference between how a thread-based server handles http requests and how an event-based server handles http requests. Let's say that a server is a shop in a building, port 80 is the shop's front door, and an http request is a customer who just walked in the front door. What happens next? How does the shop handle the customer? How does the shop handle several customers, and what difference does it make in terms of how fast a customer leaves the shop?
In short, I'm looking for an explanation of things like 'event loop' and 'thread' and "blocking" and "non-blocking" in terms of a physical, real-world analogy.
In the thread-based server analogy, each customer is served by their own shop employee. When a customer leaves, that employee is free to help the next customer. The number of customers that can be helped simultaneously is directly tied to the number of employees at the store.
In the event-based server analogy, multiple customers are served by a single shop employee – let's call him Bob. Bob delegates various steps that might take a while (like "find me item x in the back room") to other store employees. When Bob asks a helper for help, the helper scurries off to somewhere else in the store, and Bob moves on to help other customers while the original customer waits for the helper to come back. When the helper does return, having finished their task, they wait for Bob to reach a good stopping point with his current customer, and then Bob can talk to the helper and the original customer again.
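In code, Bob is the event loop's single thread, and "delegating to a helper" is a non-blocking (`await`ed) operation. A minimal sketch with Python's asyncio (the names `bob_serves`, `fetch_from_back_room`, and `shop` are illustrative, and `asyncio.sleep` stands in for real slow I/O like a database or disk read):

```python
import asyncio
import time

async def fetch_from_back_room(item):
    # The helper's errand: "await" yields control back to the event loop,
    # so Bob can serve other customers during the wait (non-blocking).
    await asyncio.sleep(0.1)  # stand-in for slow I/O
    return f"{item} from the back room"

async def bob_serves(customer, item):
    # Bob starts helping this customer, parks them at the await,
    # and resumes them when the helper comes back.
    found = await fetch_from_back_room(item)
    return f"customer {customer} got {found}"

async def shop():
    # Three customers walk in at once; Bob (one thread) interleaves them all.
    return await asyncio.gather(
        bob_serves(1, "shoes"),
        bob_serves(2, "hat"),
        bob_serves(3, "scarf"),
    )

start = time.monotonic()
receipts = asyncio.run(shop())
elapsed = time.monotonic() - start
print(receipts)
print(f"took about {elapsed:.2f}s, not 0.30s -- the three waits overlapped")
```

The key observable difference: three 0.1-second errands complete in roughly 0.1 seconds total, because Bob never stands idle while a helper is away. The flip side is that if Bob himself does something slow (a CPU-heavy computation with no `await`), every customer in the shop stalls.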