
Kafka consumer error Cancelled in-flight API_VERSIONS request with correlation id 1 due to node -1 being disconnected

I have the consumer config as follows

package com.example.kafka.config;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

import java.util.HashMap;
import java.util.Map;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:2181");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group-tenent1-id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String>
                factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

}

However, when the application starts, the following keeps being logged:

Cancelled in-flight API_VERSIONS request with correlation id 1 due to node -1 being disconnected

I was able to send a message to a Kafka topic using the following, though:

kafkaTemplate.send("test-topic", msg);

The consumer listener is as follows

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class Receiver {
    private static final Logger log = LoggerFactory.getLogger(Receiver.class);

    @KafkaListener(topics = "test-topic", groupId = "group-tenent1-id")
    public void listen(String message) {
        log.info("Received message in group group-tenent1-id: " + message);
    }
}

The producer config is as follows

package com.example.kafka.config;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {
    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }
    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

But I am not seeing the log output confirming that a message was received.


Tom asked Dec 05 '25 15:12

2 Answers

I was getting this error message for the consumer with Confluent Cloud Kafka, but not with my local Kafka cluster, and none of the other solutions applied. I suspected the security settings, but my application.yml already included them all, correctly. The fix turned out to be putting the security settings for Confluent directly in the Kafka consumer config, such as shown here (your security settings may vary):

kafkaConfigProperties.put("ssl.endpoint.identification.algorithm", "https"); 
kafkaConfigProperties.put("sasl.mechanism", "PLAIN"); 
kafkaConfigProperties.put("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"" + apiKey + "\" password=\"" + apiPassword + "\";"); 
kafkaConfigProperties.put("security.protocol", "SASL_SSL");
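For reference, a minimal sketch of how those properties might be assembled before being merged into the props map inside consumerFactory() (apiKey and apiPassword are placeholders for your Confluent Cloud credentials, not values from this post):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfluentConsumerProps {
    // Builds the security-related consumer properties for Confluent Cloud.
    // apiKey/apiPassword are placeholders; your settings may vary.
    public static Map<String, Object> build(String apiKey, String apiPassword) {
        Map<String, Object> props = new HashMap<>();
        props.put("ssl.endpoint.identification.algorithm", "https");
        props.put("sasl.mechanism", "PLAIN");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required"
                + " username=\"" + apiKey + "\" password=\"" + apiPassword + "\";");
        return props;
    }
}
```

Merging these entries into the consumer factory's property map lets the client negotiate TLS/SASL with the broker instead of failing the initial API_VERSIONS handshake.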
Don answered Dec 07 '25 05:12


In my case I was using AWS MSK bootstrap servers with IAM. I hadn't added the proper security config for security.protocol and sasl.mechanism, and after changing it my app worked without any issue.

Previously it was like this:

  security.protocol: PLAINTEXT
  sasl.mechanism: PLAIN

And then I changed it to

  security.protocol: SASL_SSL
  sasl.mechanism: AWS_MSK_IAM
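As a fuller sketch, an IAM-authenticated MSK client typically also needs the JAAS login module and callback handler from the aws-msk-iam-auth library. The class names below come from that library, and having it on the classpath is an assumption about your setup:

```java
import java.util.HashMap;
import java.util.Map;

public class MskIamProps {
    // Security properties for IAM-authenticated AWS MSK clients.
    // Requires the aws-msk-iam-auth jar on the classpath at runtime.
    public static Map<String, Object> build() {
        Map<String, Object> props = new HashMap<>();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "AWS_MSK_IAM");
        props.put("sasl.jaas.config",
                "software.amazon.msk.auth.iam.IAMLoginModule required;");
        props.put("sasl.client.callback.handler.class",
                "software.amazon.msk.auth.iam.IAMClientCallbackHandler");
        return props;
    }
}
```

These map entries can be added to both the consumer and producer property maps, or set under spring.kafka.properties in application.yml.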
Dushan answered Dec 07 '25 04:12


