How to connect Logstash with Apache Kafka? Once data is in Kafka, the question of how to get it out and into something like Elasticsearch inevitably comes up. Does anyone have a tutorial for doing this?
Thank you
Kafka is much more powerful than Logstash. For syncing data from a source such as PostgreSQL to Elasticsearch, Kafka Connect can do work similar to Logstash. One key difference: Kafka runs as a cluster, while Logstash is basically a single instance (although you can run multiple Logstash instances).
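To illustrate the Kafka Connect approach, here is a sketch of a sink connector configuration using Confluent's Elasticsearch sink connector. It assumes that connector plugin is installed on your Connect worker; the connector name, topic, and URL are placeholders to adapt to your environment:

```properties
# Sketch: Kafka Connect Elasticsearch sink configuration
# (assumes Confluent's kafka-connect-elasticsearch plugin is installed).
name=es-sink-example
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
# Topic(s) to read from; one index per topic is created by default.
topics=example-topic
# Elasticsearch endpoint.
connection.url=http://localhost:9200
# Ignore record keys and schemas for plain JSON data.
key.ignore=true
schema.ignore=true
```

With this running, every record published to example-topic would be indexed into Elasticsearch without any Logstash in between.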
Kafka Connect's JSON converter expects JSON in a certain format. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration. Consume the topic (including keys) and check the format. With schemas enabled, Kafka Connect expects an envelope like: {"schema": {...}, "payload": {"key": "value"}}.
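For example, to consume plain JSON without the schema/payload envelope, the worker (or per-connector) configuration would look roughly like this, using the standard JsonConverter that ships with Kafka Connect:

```properties
# Deserialize plain JSON values without the schema/payload envelope.
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
# Do the same for keys if they are plain JSON too.
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
```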
Logstash has an input plugin for Kafka. First of all, you should get familiar with Apache Kafka and its producer/consumer paradigm: https://kafka.apache.org/. Then get started with Logstash: https://www.elastic.co/products/logstash. After that, you will be able to use the Kafka input plugin for Logstash: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html. The last step is to build a Logstash pipeline that inserts data into a destination like Elasticsearch. This simple example can help you achieve your goal:
logstash.conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["example-topic"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "example-index"
  }
}
Here we simply read the data arriving on a specific Kafka topic, then store it in an Elasticsearch index. Hope that helps!
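If your Kafka messages are JSON, a slightly extended pipeline (a sketch, assuming the same topic and hosts as above) can parse each message into structured fields and attach Kafka metadata to the event:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["example-topic"]
    # Parse each message as JSON instead of a plain string.
    codec => "json"
    # Add [@metadata][kafka] fields (topic, partition, offset).
    decorate_events => true
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "example-index"
  }
}
```

With codec => "json", the JSON keys become top-level event fields in Elasticsearch rather than a single message string.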