Kafka consumer outputs excessive DEBUG statements to console (Eclipse)

I'm running some sample code from http://www.javaworld.com/article/3060078/big-data/big-data-messaging-with-kafka-part-1.html?page=2, and the KafkaConsumer is consuming from the topic as desired, but every poll prints many DEBUG logs to stdout, which I don't want.

I have tried changing all INFO and DEBUG entries to ERROR in /config/log4j.properties (I even did a grep to make sure), in particular setting log4j.logger.kafka=ERROR, kafkaAppender, but the problem persists. I referred to "How to configure logging for Kafka producers?" and adopted the solution there, but perhaps the situation is different for consumers?

The DEBUG messages all have a similar format:

[Thread-0] DEBUG org.apache.kafka.clients.consumer.internals.Fetcher - Sending fetch for partitions... to broker... (id: 0 rack: null)

and they appear at a rate of roughly 10 per second (I tried changing the poll argument to 1000 or even 10000; it doesn't help).

Would really appreciate any help from any expert. Thanks in advance!

Edit: not sure if it matters, but I added BasicConfigurator.configure(); to my main method to resolve another error that previously stopped the consumer from even starting.
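For reference, log4j reads log4j.properties from the application's classpath (e.g. src/main/resources in a Maven project), not from Kafka's /config directory, which is the broker's own configuration. A minimal sketch of such a client-side file, assuming the log4j 1.x setup used by the article's example (the pattern layout shown here is illustrative):

```
# log4j.properties on the application classpath (not Kafka's /config)
log4j.rootLogger=WARN, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %-5p %c{1} - %m%n
```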

Bay Wei Heng asked May 23 '17 08:05


2 Answers

Create a new config XML file:

src/main/resources/logback.xml

<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <logger name="org.apache.kafka" level="WARN"/>
    <logger name="org.apache.kafka.common.metrics" level="WARN"/>
    <root level="warn">
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
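This assumes the SLF4J/Logback pair is on the application's classpath so that logback.xml is actually picked up; with Maven that would be something like the following (the version shown is illustrative, not prescribed by the answer):

```xml
<!-- assumed Maven dependency; adjust the version to your project -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.11</version>
</dependency>
```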
Zouinkhi answered Oct 10 '22 20:10


Just modify the logging level of the chatty class. Since your logs show entries originating from org.apache.kafka.clients.consumer.internals.Fetcher, you can adjust the logging level for that logger by adding the following line to log4j.properties:

log4j.logger.org.apache.kafka.clients.consumer.internals.Fetcher=WARN

... or any broader logger, since logger names are namespaced:

# adjusting logging for entire Kafka
log4j.logger.org.apache.kafka=WARN
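Because loggers form a dot-separated hierarchy, each line below is progressively broader; any one of them would silence the Fetcher's DEBUG output (a sketch, not required in combination):

```
# narrowest: only the Fetcher
log4j.logger.org.apache.kafka.clients.consumer.internals.Fetcher=WARN
# broader: all client classes
log4j.logger.org.apache.kafka.clients=WARN
# broadest: everything under org.apache.kafka
log4j.logger.org.apache.kafka=WARN
```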

Hope this helps

diginoise answered Oct 10 '22 21:10