I have worked on some Kafka Streams applications and Kafka consumer applications. In the end, Kafka Streams is nothing but a consumer that consumes real-time events from Kafka, so I am not able to figure out when to use Kafka Streams, or why we should use it at all, since we can perform all transformations on the consumer side.
I want to understand the main differences between Kafka Streams and a plain Kafka consumer, implementation-wise, and how to decide which one to use in different use cases.
Thanks in advance for any answers.
It's a question of "ease of use" (or simplicity) versus "flexibility". The two "killer features" of Kafka Streams, compared to a plain consumer/producer, are built-in fault-tolerant state management and exactly-once processing via Kafka transactions: building a stateful, fault-tolerant application, or using Kafka transactions with plain consumers/producers, is quite difficult to get right. Furthermore, the higher-level DSL provides a lot of built-in operators that are hard to build from scratch, especially stateful ones such as aggregations, joins, and windowing.
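To illustrate what those built-in operators buy you, here is a minimal sketch of a windowed count per key in the DSL. The topic names (`clicks`, `click-counts`) are made up for the example, and it assumes a recent Kafka Streams 3.x API; the point is that the fault-tolerant state store and its changelog topic come for free:

```java
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class WindowedCountExample {
    // Counts events per key in 5-minute windows and writes the running
    // counts to an output topic. State handling is managed by the library.
    public static StreamsBuilder build() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("clicks", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey()
               .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
               .count()
               .toStream()
               // unwrap the windowed key so the output topic has plain String keys
               .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count))
               .to("click-counts", Produced.with(Serdes.String(), Serdes.Long()));
        return builder;
    }
}
```

Replicating this by hand with a plain consumer would mean managing a local store, restoring it on rebalance, and handling window expiry yourself.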
Another nice feature is punctuations.
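Punctuations let you schedule a callback that fires on wall-clock or stream time, e.g. to emit periodic results or flush buffered state. A minimal sketch using the Processor API (the 30-second interval and the forwarded "stats" record are made up for illustration):

```java
import java.time.Duration;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

public class PeriodicFlushProcessor implements Processor<String, String, String, String> {
    private ProcessorContext<String, String> context;
    private long seen = 0;

    @Override
    public void init(ProcessorContext<String, String> context) {
        this.context = context;
        // Fire every 30 seconds of wall-clock time and forward a summary
        // record downstream; PunctuationType.STREAM_TIME is the alternative.
        context.schedule(Duration.ofSeconds(30), PunctuationType.WALL_CLOCK_TIME,
            timestamp -> this.context.forward(
                new Record<>("stats", Long.toString(seen), timestamp)));
    }

    @Override
    public void process(Record<String, String> record) {
        seen++; // just count records; a real processor would transform/forward them
    }
}
```

With a plain consumer you would need your own timer thread plus careful synchronization with the poll loop to get the same behavior.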
However, even if you build a simple stateless application, using Kafka Streams can significantly reduce your code base (i.e., avoid boilerplate code). Hence, the recommendation is to use Kafka Streams when possible, and only fall back to plain consumers/producers if Kafka Streams is not flexible enough for your use case.
They're different ways to do the same thing, with different levels of abstraction and functionality.
Here's a side-by-side comparison of doing the same thing (splitting a string into two separate fields) with plain Kafka clients vs in Kafka Streams (for good measure it shows doing it in ksqlDB too).
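A rough sketch of that comparison (the topic names and the `"First Last"` → `"First|Last"` split format are invented for the example): with plain clients you own the poll loop, the transformation, and the produce call, while in Kafka Streams the same logic collapses to one `mapValues`:

```java
import java.time.Duration;
import java.util.List;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class SplitComparison {
    // Shared transformation: "John Smith" -> "John|Smith".
    static String split(String full) {
        String[] parts = full.split(" ", 2);
        return parts[0] + "|" + (parts.length > 1 ? parts[1] : "");
    }

    // Plain consumer/producer: you manage subscribing, polling, iterating,
    // producing, and (not shown) error handling and offset semantics.
    static void plainClientsVersion(KafkaConsumer<String, String> consumer,
                                    KafkaProducer<String, String> producer) {
        consumer.subscribe(List.of("names"));
        while (true) {
            for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                producer.send(new ProducerRecord<>("names-split", rec.key(), split(rec.value())));
            }
        }
    }

    // Kafka Streams: the same pipeline declared as a topology.
    static StreamsBuilder streamsVersion() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("names", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(SplitComparison::split)
               .to("names-split", Produced.with(Serdes.String(), Serdes.String()));
        return builder;
    }
}
```

Both versions read from one topic and write the transformed value to another; the difference is how much plumbing you write and maintain yourself.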