 

Is it better to keep a Kafka Producer open or to create a new one for each message?

I have data coming in through RabbitMQ. The data is coming in constantly, multiple messages per second. I need to forward that data to Kafka.

In my RabbitMQ delivery callback, where I receive the data, I have a Kafka producer that immediately sends the received messages to Kafka. My question is very simple: is it better to create a Kafka producer outside of the callback method and use that one producer for all messages, or should I create the producer inside the callback method and close it after the message is sent, which means creating a new producer for each message?

It might be a naive question, but I am new to Kafka and so far I have not found a definitive answer on the internet.

EDIT: I am using the Java Kafka client.

asked Nov 07 '25 by mirzaD14

2 Answers

Creating a Kafka producer is an expensive operation, so reusing a single producer instance (effectively a singleton) is good practice for both performance and resource usage.

For Java clients, this is from the docs:

The producer is thread safe and should generally be shared among all threads for best performance.

For librdkafka-based clients (confluent-dotnet, confluent-python, etc.), I can link this related issue, with this quote from it:

Yes, creating a singleton service like that is a good pattern. you definitely should not create a producer each time you want to produce a message - it is approximately 500,000 times less efficient.
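A minimal sketch of the shared-producer pattern with the Java client, assuming `kafka-clients` and the RabbitMQ `amqp-client` library are on the classpath. The broker address, topic name (`forwarded-events`), and class name are illustrative, not from the question:

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import com.rabbitmq.client.DeliverCallback;

public class RabbitToKafkaBridge {

    // Created once and shared; KafkaProducer is documented as thread safe.
    private static final Producer<String, String> PRODUCER = createProducer();

    private static Producer<String, String> createProducer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumption: local broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return new KafkaProducer<>(props);
    }

    // RabbitMQ delivery callback: reuse the shared producer.
    // Do NOT construct (or close) a producer in here.
    static final DeliverCallback CALLBACK = (consumerTag, delivery) -> {
        String payload = new String(delivery.getBody(), StandardCharsets.UTF_8);
        PRODUCER.send(new ProducerRecord<>("forwarded-events", payload));
    };

    public static void main(String[] args) {
        // Flush buffered records and release connections on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(PRODUCER::close));
        // ... register CALLBACK with the RabbitMQ channel here ...
    }
}
```

Note that `send()` is asynchronous: records go into the producer's internal buffer and are batched to the broker, which is another reason a long-lived producer outperforms one created per message. Closing the producer only on shutdown ensures that buffer is flushed exactly once.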

answered Nov 10 '25 by ndogac


The Kafka producer is stateful: it holds cluster metadata (periodically synced from the brokers), an in-memory send buffer, and so on. Creating a producer for each message is therefore impractical.

answered Nov 10 '25 by louxiu


