I have an issue with a Kafka producer in production where I see the error below.
"Publish failed, Expiring 2 record(s) for myTopic-1:120004 ms has passed since batch creation [org.apache.kafka.common.errors.TimeoutException: Expiring 2 record(s) for myTopic-1:120004 ms has passed since batch creation]"
The Kafka brokers are Confluent 5.3.2 and the client is Apache kafka-clients 2.3.1. The producer configs explicitly set in my code are listed below; everything else is left at the defaults.
batch.size = 102400
linger.ms = 100
compression.type = lz4
acks = all
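For context, a minimal sketch of how a producer with these settings might be built (the broker address, serializer choices, and variable names are my assumptions, not taken from the actual application):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");                       // assumed broker address
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.BATCH_SIZE_CONFIG, 102400);        // batch.size
props.put(ProducerConfig.LINGER_MS_CONFIG, 100);            // linger.ms
props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");   // compression.type
props.put(ProducerConfig.ACKS_CONFIG, "all");               // acks

Producer<String, String> producer = new KafkaProducer<String, String>(props);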
Sample Java Code
// Build a record for partition 1 of myTopic and send it asynchronously with a callback.
ProducerRecord<String, String> rec =
        new ProducerRecord<String, String>("myTopic", 1, "myKey", "json-payload-here");
producer.send(rec, new ProducerCallback(jsonPayload));

private class ProducerCallback implements Callback {
    private String _ME = "onCompletion";
    private String jsonPayload;

    public ProducerCallback(String jsonPayload) {
        this.jsonPayload = jsonPayload;
    }

    @Override
    public void onCompletion(RecordMetadata recordMetadata, Exception e) {
        if (e == null) {
            LOG.logp(Level.FINEST, _CL, _ME, "Published kafka event " + jsonPayload);
        } else {
            // Note: the exception (the TimeoutException above) is logged here.
            LOG.log(Level.SEVERE, "Publish failed, " + e.getMessage(), e);
        }
    }
}
A couple of questions:
1. Is there any way to get the expired records back?
2. What can I change so the batches do not expire?
Thanks, I appreciate your help in advance.
The batch expires, so no, you cannot get the data back unless you saved it in some other data structure.
To get the batch sent sooner, you can lower the batch size or explicitly call producer.flush(). To increase how long the producer will wait before expiring a batch, use request.timeout.ms.
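As a rough sketch of those two options, building on the props/producer from the construction sketch above (the values are illustrative assumptions, not recommendations). Note that on kafka-clients 2.1+ the batch-expiry deadline is bounded by delivery.timeout.ms, whose 120000 ms default matches the 120004 ms in the error, so it is worth raising alongside request.timeout.ms:

// Option 1: give in-flight batches more time before they expire.
// Keep delivery.timeout.ms >= linger.ms + request.timeout.ms.
props.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, 60000);      // request.timeout.ms
props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 180000);    // delivery.timeout.ms (clients 2.1+)

// Option 2: push buffered batches out immediately instead of waiting for batch.size / linger.ms.
producer.send(rec, new ProducerCallback(jsonPayload));
producer.flush();   // blocks until every previously sent record has completed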
In my case it was the batch size that caused this error. We had the batch size set to 10 MB; once we reduced it to 1 MB, the issue was resolved.
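In producer code, assuming the size is set through the standard batch.size property, that change looks roughly like this:

// was: batch.size = 10 MB
props.put(ProducerConfig.BATCH_SIZE_CONFIG, 10 * 1024 * 1024);
// now: batch.size = 1 MB (resolved the expirations in our case)
props.put(ProducerConfig.BATCH_SIZE_CONFIG, 1024 * 1024);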