For metrics we need to see the total size of a Kafka topic in bytes, across all partitions and brokers.
I have been searching for quite a while and haven't worked out whether this is possible or how to do it.
We are on v0.8.2 of Kafka.
'kafka-log-dirs --describe --bootstrap-server kafka:9092' will return the state of all topics/partitions; '--topic-list' will narrow down that list.
And then I discovered the kafka-log-dirs tool. This tool, available in the bin folder of Kafka, lets you query the size occupied by each partition on each broker. You can also restrict the query to specific topics using the --topic-list option. The size reported is the space occupied by the partition, in bytes.
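For reference, the --describe output is a JSON document roughly along these lines (the broker ID, log directory, and sizes below are invented for illustration):

```json
{
  "version": 1,
  "brokers": [
    {
      "broker": 0,
      "logDirs": [
        {
          "logDir": "/var/lib/kafka/data",
          "error": null,
          "partitions": [
            { "partition": "my-topic-0", "size": 1048576, "offsetLag": 0, "isFuture": false }
          ]
        }
      ]
    }
  ]
}
```

Summing the "size" fields for a topic across all brokers and log directories gives its total size in bytes.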
So for example, if you are generally sending in 200MB a day of messages to a single-partition topic and you want to keep them for 5 days, you would set retention.bytes to 1GB (200MB x 5 days). If this were spread over 10 partitions, you would set retention.bytes = 100MB (1GB / 10 partitions), because retention.bytes applies per partition.
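The sizing arithmetic above can be sketched in a few lines of shell (the 200MB/day rate, 5-day retention, and 10 partitions are the example figures from the text):

```shell
#!/bin/sh
# Example sizing figures from the text above
daily_mb=200      # average ingest per day, in MB
days=5            # desired retention period, in days
partitions=10     # number of partitions in the topic

total_mb=$((daily_mb * days))                 # total data to retain across the topic
per_partition_mb=$((total_mb / partitions))   # retention.bytes is applied per partition

echo "total retained: ${total_mb}MB"
echo "retention.bytes per partition: ~${per_partition_mb}MB"
```

Dividing by the partition count is the key step, since retention.bytes is a per-partition limit, not a per-topic one.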
You can use the bin/kafka-topics.sh shell script with the ZooKeeper service URL and the --list option to display a list of all the topics in the Kafka cluster. On newer clients you can instead pass a Kafka broker URL to list all topics.
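For example (the ZooKeeper and broker addresses are placeholders for your own cluster):

```shell
# Older clients: list topics via ZooKeeper
bin/kafka-topics.sh --list --zookeeper zookeeper:2181

# Newer clients (Kafka 2.2+): list topics via a broker
bin/kafka-topics.sh --list --bootstrap-server kafka:9092
```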
You can see the partition size using the script bin/kafka-log-dirs.sh:
bin/kafka-log-dirs.sh --describe --bootstrap-server <KafkaBrokerHost>:<KafkaBrokerPort> --topic-list <YourTopic>
As Martbob very helpfully mentioned, you can do this using kafka-log-dirs. The tool emits its JSON on a single line of output, so I can use the ever-so-useful jq
tool to pull out the 'size' fields (some are null), keep only the numeric ones, gather them into an array, and add them together.
kafka-log-dirs \
--bootstrap-server 127.0.0.1:9092 \
--topic-list 'topic_of_interest' \
--describe \
| grep '^{' \
| jq '[ ..|.size? | numbers ] | add'
Example output: 67704
I haven't verified if the output makes sense, so you should check that yourself.
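One way to sanity-check the jq filter itself is to run it over a small hand-made JSON document. The structure below only mimics the real kafka-log-dirs output, and the sizes are invented:

```shell
#!/bin/sh
# Hypothetical, trimmed-down stand-in for kafka-log-dirs JSON output
json='{"brokers":[{"broker":0,"logDirs":[{"logDir":"/tmp/kafka-logs","error":null,"partitions":[{"partition":"t-0","size":100},{"partition":"t-1","size":200}]}]}]}'

# Recursively collect every numeric "size" field and sum them
echo "$json" | jq '[ .. | .size? | numbers ] | add'
# prints 300
```

The `..` recursive descent visits every value in the document, `.size?` extracts the size field where one exists (without erroring elsewhere), and `numbers` drops the nulls before `add` sums the array.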