I'm a beginner with Apache NiFi, and all the tutorials I have read so far talk about integrating Kafka with NiFi. How is Kafka complementary to NiFi? Why don't we use NiFi directly to publish our messages, without using Kafka?
Note: none of the tutorials I have seen address this point.
With its in-flight data transformation, data compression, and fine-grained security controls, Apache NiFi offers capabilities that Kafka alone does not. So when you need to modify, compress, or secure data as it moves, Apache NiFi has an edge over plain Kafka.
To get the most out of Apache NiFi, you can use it as a Kafka producer: it ingests data from virtually any source and forwards it to the Kafka broker. In other words, NiFi can replace a hand-written producer, delivering data to the appropriate Kafka topics.
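For comparison, here is a minimal sketch of the kind of hand-written producer that a NiFi flow ending in a PublishKafka processor would replace. It assumes the kafka-python client, a broker at localhost:9092, and a hypothetical "events" topic; none of these come from the original answer.

```python
# Minimal hand-written Kafka producer -- roughly what a NiFi flow ending in
# a PublishKafka processor does for you without any code.
# Assumptions: kafka-python client, broker at localhost:9092, topic "events".
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# In NiFi, the "source" side would be a processor such as GetFile or ListenHTTP;
# here we simply hard-code a sample record.
record = {"id": 1, "event": "login", "user": "alice"}
producer.send("events", record)
producer.flush()
```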
What Apache NiFi Does. Apache NiFi is an integrated data logistics platform for automating the movement of data between disparate systems. It provides real-time control that makes it easy to manage the movement of data between any source and any destination.
NiFi and Kafka complement each other in the sense that NiFi is not a messaging queue like Apache Kafka. Rather, Apache NiFi is a data-flow management (i.e. data logistics) tool.
Let's assume this scenario: you have messages (in JSON format) being streamed through Kafka, you want to validate each message to check that it has all the required fields and that they are valid, and you want the valid messages to land in HBase.
Here NiFi can help you with the following approach:
1. ConsumeKafka processor, configured with your Kafka broker and consumer group name.
2. ValidateRecord processor, to check that the received messages are all valid.
3. PutHBaseRecord processor, to write the valid records to HBase.
Summarizing, NiFi basically saves you from writing a lot of boilerplate code: in this case, custom logic to do schema validation and write to HBase.
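As a rough illustration of that boilerplate, here is a sketch of what the same consume-validate-store pipeline might look like written by hand. It assumes the kafka-python and happybase clients, a local broker, a hypothetical "events" topic and consumer group, an HBase table named "events" with a "cf" column family, and a made-up required-field list; these are all illustrative, not part of the original answer.

```python
# Hand-written version of the ConsumeKafka -> ValidateRecord -> PutHBaseRecord flow.
# Assumptions: kafka-python and happybase clients, broker at localhost:9092,
# topic "events", consumer group "nifi-demo", HBase table "events" with family "cf".
import json

import happybase
from kafka import KafkaConsumer

REQUIRED_FIELDS = {"id", "event", "user"}  # hypothetical schema

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    group_id="nifi-demo",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

hbase = happybase.Connection("localhost")
table = hbase.table("events")

for message in consumer:
    record = message.value
    # "ValidateRecord": check that every required field is present.
    if not REQUIRED_FIELDS.issubset(record):
        continue  # a real flow would route invalid records elsewhere
    # "PutHBaseRecord": write the valid record, one column per field.
    row_key = str(record["id"]).encode("utf-8")
    table.put(
        row_key,
        {f"cf:{k}".encode("utf-8"): str(v).encode("utf-8") for k, v in record.items()},
    )
```

Even this stripped-down version says nothing about error handling, retries, back-pressure, or audit trails, all of which the NiFi processors give you out of the box.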
Found an interesting answer on the Hortonworks community forum; I share it here for the sake of completeness:
Apache NiFi and Apache Kafka are two different tools with different use-cases that may slightly overlap. Here is my understanding of the purpose of the two projects.
NiFi is "An easy to use, powerful, and reliable system to process and distribute data."
It is a visual tool (with a REST api) that implements flow-based programming to enable the user to craft flows that will take data from a large variety of different sources, perform enrichment, routing, etc on the data as it's being processed, and output the result to a large variety of destinations. During this process, it captures metadata (provenance) on what has happened to each piece of data (FlowFile) as it made its way through the Flow for audit logging and troubleshooting purposes.
"Apache Kafka is publish-subscribe messaging rethought as a distributed commit log"
It is a distributed implementation of the publish-subscribe pattern that allows developers to connect programs to each other in different languages and across a large number of machines. It is more of a building block for distributed computing than it is an all-in-one solution for processing data.