While using the Disruptor, one or more consumers may lag behind, and because of that slow consumer the whole application is affected.
Keeping in mind that every producer (publisher) and every consumer (EventProcessor) runs on its own single thread, what is the solution to the slow-consumer problem?
Can we use multiple threads on a single consumer? If not, what is a better alternative?
Generally speaking, use a WorkerPool to allow multiple pooled worker threads to service a single consumer stage. This works well when the tasks are independent and of potentially variable duration (e.g. some short tasks, some longer); see the sketch below.
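For illustration, here is a minimal sketch of the WorkerPool approach, assuming Disruptor 3.x (where the DSL exposes handleEventsWithWorkerPool) and a hypothetical ValueEvent class; every event is handed to exactly one of the pooled workers, so a slow event on one thread does not stall the others:

```java
import com.lmax.disruptor.RingBuffer;
import com.lmax.disruptor.WorkHandler;
import com.lmax.disruptor.dsl.Disruptor;
import com.lmax.disruptor.util.DaemonThreadFactory;

// Hypothetical event type carrying one payload field.
class ValueEvent {
    long value;
}

public class WorkerPoolExample {
    public static void main(String[] args) {
        Disruptor<ValueEvent> disruptor =
                new Disruptor<>(ValueEvent::new, 1024, DaemonThreadFactory.INSTANCE);

        // Two pooled workers: each event is consumed by exactly one of them,
        // so variable-duration tasks spread naturally across the pool.
        WorkHandler<ValueEvent> workerA = event -> handle(event.value);
        WorkHandler<ValueEvent> workerB = event -> handle(event.value);
        disruptor.handleEventsWithWorkerPool(workerA, workerB);

        disruptor.start();

        // Publish a few events; the translator fills the preallocated slot.
        RingBuffer<ValueEvent> ringBuffer = disruptor.getRingBuffer();
        for (int i = 0; i < 10; i++) {
            ringBuffer.publishEvent((event, sequence) -> event.value = sequence);
        }
    }

    static void handle(long value) { /* variable-duration work goes here */ }
}
```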
The other option is to have multiple independent handlers process the events in parallel, with each handler acting only on sequences that match its index modulo N (e.g. 2 handlers, one processing odd event IDs and the other even). This works great if processing duration is consistent across events, and it also lets batching work very efficiently; a sketch follows.
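A rough sketch of the modulo-sharding idea, reusing the hypothetical ValueEvent from the previous example. Both handlers receive every event, but each one only acts on its own share of the sequence numbers:

```java
import com.lmax.disruptor.EventHandler;

// Hypothetical handler that only processes events in its own shard:
// ordinal 0 takes even sequences, ordinal 1 takes odd ones.
class ShardedHandler implements EventHandler<ValueEvent> {
    private final long ordinal;
    private final long numHandlers;

    ShardedHandler(long ordinal, long numHandlers) {
        this.ordinal = ordinal;
        this.numHandlers = numHandlers;
    }

    @Override
    public void onEvent(ValueEvent event, long sequence, boolean endOfBatch) {
        if (sequence % numHandlers == ordinal) {
            handle(event.value);
        }
    }

    private void handle(long value) { /* consistent-duration work goes here */ }
}
```

Wiring it up is just `disruptor.handleEventsWith(new ShardedHandler(0, 2), new ShardedHandler(1, 2));` so the two handlers run on their own threads and advance through the ring buffer independently.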
Another thing to consider is that the consumer can do "batching", which is especially useful, for example, in auditing. If your consumer has 10 events waiting, rather than writing 10 events to the audit log independently, you can collect all 10 and write them in a single call (see the sketch below). In my experience this more than covers the need to run multiple threads.
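A possible batching handler, again assuming the hypothetical ValueEvent. It relies on the endOfBatch flag that the Disruptor passes to EventHandler.onEvent to flush all accumulated events in one write:

```java
import com.lmax.disruptor.EventHandler;
import java.util.ArrayList;
import java.util.List;

// Hypothetical auditing handler: buffers events and flushes them only when
// the Disruptor signals the end of the available batch, turning N small
// writes into one larger write.
class BatchingAuditHandler implements EventHandler<ValueEvent> {
    private final List<Long> pending = new ArrayList<>();

    @Override
    public void onEvent(ValueEvent event, long sequence, boolean endOfBatch) {
        pending.add(event.value);
        if (endOfBatch) {
            writeAuditBatch(pending);  // one I/O call for the whole batch
            pending.clear();
        }
    }

    private void writeAuditBatch(List<Long> batch) { /* single write to the audit log */ }
}
```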
Try to move the slow part (I/O, anything that is not an O(1) or O(log n) calculation, etc.) to another thread, or apply some kind of back pressure when the consumer is overloaded (by yielding or temporarily parking producers, replying with 503 or 429 status codes, etc.): http://mechanical-sympathy.blogspot.com/2012/05/apply-back-pressure-when-overloaded.html
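One way to apply back pressure on the producer side (a sketch, assuming the hypothetical ValueEvent from the answer above) is to publish with RingBuffer.tryNext(), which fails fast instead of blocking when consumers are lagging, so the caller can shed load or reply with 429/503 upstream:

```java
import com.lmax.disruptor.InsufficientCapacityException;
import com.lmax.disruptor.RingBuffer;

// Hypothetical publisher that reports overload instead of blocking.
class BackPressurePublisher {
    private final RingBuffer<ValueEvent> ringBuffer;

    BackPressurePublisher(RingBuffer<ValueEvent> ringBuffer) {
        this.ringBuffer = ringBuffer;
    }

    /** Returns true if published, false if the caller should shed load. */
    boolean tryPublish(long value) {
        try {
            long sequence = ringBuffer.tryNext();  // fails fast when the buffer is full
            try {
                ringBuffer.get(sequence).value = value;
            } finally {
                ringBuffer.publish(sequence);
            }
            return true;
        } catch (InsufficientCapacityException e) {
            return false;  // consumers are lagging: signal back pressure (e.g. HTTP 429)
        }
    }
}
```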