I am using Apache Kafka for messaging. I have implemented the producer and consumer in Java. How can we get the number of messages in a topic?
Run kcat to count the messages. You can count the number of messages in a Kafka topic simply by consuming the entire topic and counting how many messages are read. To do this from the command line you can use the kcat tool, which can act as a consumer (and producer) and is built around the Unix philosophy of pipelines.
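For example (a minimal sketch; <broker>, <port> and <topic-name> are placeholders for your own cluster and topic):

# Consume the whole topic once, exit at the end of the last partition, and count the lines.
kcat -C -b <broker>:<port> -t <topic-name> -e -q | wc -l

Note that this counts output lines, so it reads the entire topic over the network and the result can be skewed if message values contain embedded newlines.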
As Martbob very helpfully mentioned, you can do this using kafka-log-dirs. This produces JSON output (on one of the lines), so I can use the ever-so-useful jq tool to pull out the 'size' fields (some are null), select only the ones that are numbers, collect them into an array, and add them together.
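Something along these lines should work (a sketch; <broker>, <port> and <topic-name> are placeholders). Keep in mind that the 'size' fields are on-disk sizes in bytes, so this gives you the total size of the topic's log segments rather than a message count:

# Describe the log dirs for one topic, keep the JSON line, and sum every numeric 'size' field.
./bin/kafka-log-dirs.sh --bootstrap-server <broker>:<port> --describe --topic-list <topic-name> \
  | grep '^{' \
  | jq '[ .. | .size? | numbers ] | add'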
In the “Properties” tab you can choose the key and message content type. If you added the Avro plugin, as described above, you'll see the Avro type as well. Choose the type used for your messages and go to the “Data” tab. Now you can see all the messages that are in the selected topic.
It is not Java, but it may be useful:
./bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list <broker>:<port> \
  --topic <topic-name> \
  | awk -F ":" '{sum += $3} END {print sum}'
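This sums the latest offset of every partition, which equals the number of messages in the topic only if nothing has been removed by retention or compaction yet; if it has, you can run the tool again with --time -2 to get the earliest offsets and subtract that sum. Depending on your Kafka version, the tool may expect --bootstrap-server instead of --broker-list.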
The only way that comes to mind for this from a consumer point of view is to actually consume the messages and count them.
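Since you already have a Java consumer, here is a minimal sketch of that idea, assuming a broker at localhost:9092 and a topic named my-topic (both placeholders): it assigns all partitions of the topic, records the current end offsets, rewinds to the beginning, and counts records until every partition has caught up to the end offset captured at the start.

import java.time.Duration;
import java.util.*;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class TopicMessageCounter {

    public static void main(String[] args) {
        // Placeholder broker address and topic name -- replace with your own.
        String bootstrapServers = "localhost:9092";
        String topic = "my-topic";

        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // Assign all partitions of the topic explicitly (no consumer group needed).
            List<TopicPartition> partitions = new ArrayList<>();
            consumer.partitionsFor(topic).forEach(p ->
                    partitions.add(new TopicPartition(topic, p.partition())));
            consumer.assign(partitions);

            // Record where each partition currently ends, then rewind to the beginning.
            Map<TopicPartition, Long> endOffsets = consumer.endOffsets(partitions);
            consumer.seekToBeginning(partitions);

            long count = 0;
            // Poll until every partition has reached the end offset captured above.
            while (!reachedEnd(consumer, endOffsets)) {
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(1));
                count += records.count();
            }
            System.out.println("Messages in topic '" + topic + "': " + count);
        }
    }

    private static boolean reachedEnd(KafkaConsumer<byte[], byte[]> consumer,
                                      Map<TopicPartition, Long> endOffsets) {
        for (Map.Entry<TopicPartition, Long> entry : endOffsets.entrySet()) {
            if (consumer.position(entry.getKey()) < entry.getValue()) {
                return false;
            }
        }
        return true;
    }
}

Because the end offsets are captured once up front, messages produced while the count is running are not included, and on a large topic this reads the full data set, so it can take a while.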
The Kafka broker exposes JMX counters for the number of messages received since start-up, but you cannot know how many of them have already been purged.
In most common scenarios, messages in Kafka are best seen as an infinite stream, and getting a discrete count of how many are currently kept on disk is not really relevant. Furthermore, things get more complicated when dealing with a cluster of brokers, each of which holds only a subset of the messages in a topic.