I am a fresher in Big Data. I have a database in MySQL and I don't know how to import it into Apache Kafka for stream processing, and then read the data back from a consumer. Please give me advice.
One option is Apache NiFi: it can read data from MySQL and publish table rows to a Kafka topic. NiFi is open-source software for automating and managing data flow between systems, used in many big data scenarios, and it is a robust and reliable system for processing and distributing data.
When we speak of Kafka integrations with databases, we usually mean one of two integration types from the database side: as a Source, where data is read from source database tables and written into target Kafka topics; or as a Sink, where data is read from Kafka topics and written into target database tables.
A Kafka producer has a pool of buffers that holds to-be-sent records. The producer has background I/O threads that turn records into request bytes and transmit the requests to the Kafka cluster. The producer must be closed so it does not leak resources (connections, thread pools, buffers).
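A minimal sketch of that producer lifecycle using the `kafka-python` client. The topic name `mysql.users`, the broker address, and the sample row are illustrative assumptions, not details from the question; the broker-dependent part only runs when a `KAFKA_BOOTSTRAP` environment variable is set.

```python
# Sketch: publishing MySQL rows to a Kafka topic with kafka-python.
# Topic name, broker address, and row contents are assumptions for
# illustration only.
import json
import os


def row_to_bytes(row: dict) -> bytes:
    """Serialize one table row as UTF-8 JSON for the Kafka record value."""
    return json.dumps(row, sort_keys=True).encode("utf-8")


def publish_rows(rows, topic="mysql.users", bootstrap="localhost:9092"):
    # Imported lazily so the serializer above is usable without kafka-python.
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(bootstrap_servers=bootstrap)
    try:
        for row in rows:
            # send() is asynchronous: records land in the producer's buffer
            # pool, and background I/O threads ship them to the cluster.
            producer.send(topic, value=row_to_bytes(row))
        producer.flush()  # block until buffered records are sent
    finally:
        producer.close()  # release connections, threads, and buffers


if os.environ.get("KAFKA_BOOTSTRAP"):
    publish_rows([{"id": 1, "name": "alice"}],
                 bootstrap=os.environ["KAFKA_BOOTSTRAP"])
```

The `try`/`finally` around `close()` is the point of the lead-in: without it, a failure mid-send leaks the producer's connections and threads.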
Regarding getting MySQL data into Kafka, I'd suggest having a look at the Kafka Connect JDBC source connector.
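A minimal sketch of a JDBC source connector configuration. The connection URL, credentials, table name, and topic prefix are placeholder assumptions; `mode=incrementing` polls the table for new rows by an auto-increment column.

```properties
name=mysql-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# Placeholder connection details -- adjust host, database, and credentials.
connection.url=jdbc:mysql://localhost:3306/mydb
connection.user=kafka
connection.password=secret
# Detect new rows via a strictly increasing column (e.g. an auto-increment id).
mode=incrementing
incrementing.column.name=id
table.whitelist=users
# Rows from table "users" land in topic "mysql-users".
topic.prefix=mysql-
```

With this config, each new row in `users` is published as a record to the `mysql-users` topic without writing any producer code yourself.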
Next, you have a range of stream processing frameworks to choose from, each with its own benefits and drawbacks, for doing computation on the streaming data stored in Kafka: for example Kafka Streams, Apache Flink, and Spark Structured Streaming.
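Since the question also asks how to get the data back out, here is a matching consumer sketch with `kafka-python`. The topic and group id are illustrative assumptions, and the broker-dependent loop only runs when `KAFKA_BOOTSTRAP` is set.

```python
# Sketch: consuming and decoding JSON records from a Kafka topic with
# kafka-python. Topic name and group id are assumptions for illustration.
import json
import os


def decode_record(value: bytes) -> dict:
    """Decode a JSON-encoded record value back into a dict."""
    return json.loads(value.decode("utf-8"))


def consume(topic="mysql.users", bootstrap="localhost:9092"):
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=bootstrap,
        group_id="mysql-readers",      # consumers in a group split partitions
        auto_offset_reset="earliest",  # start from the oldest retained record
    )
    try:
        for message in consumer:  # blocks, yielding records as they arrive
            row = decode_record(message.value)
            print(message.topic, message.offset, row)
    finally:
        consumer.close()


if os.environ.get("KAFKA_BOOTSTRAP"):
    consume(bootstrap=os.environ["KAFKA_BOOTSTRAP"])
```

Running several copies of this script with the same `group_id` spreads the topic's partitions across them, which is how Kafka scales out consumption.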