I'm running some sample code from http://www.javaworld.com/article/3060078/big-data/big-data-messaging-with-kafka-part-1.html?page=2, and the KafkaConsumer is consuming from the topic as desired, but every poll prints many DEBUG log lines to stdout, which I don't want.
I have tried changing all INFO and DEBUG levels to ERROR in /config/log4j.properties (I even ran a grep to make sure), in particular setting log4j.logger.kafka=ERROR, kafkaAppender, but the problem persists. I referred to How to configure logging for Kafka producers? and adopted the solution there, but perhaps the situation is different for consumers?
The DEBUG messages all have a similar format:
[Thread-0] DEBUG org.apache.kafka.clients.consumer.internals.Fetcher - Sending fetch for partitions... to broker... (id: 0 rack: null)
and they are appearing at a rate of about 10 per second (changing the poll argument to 1000 or even 10000 doesn't help; I tried).
Would really appreciate any help from any expert. Thanks in advance!
Edit: Not sure if it matters, but I added BasicConfigurator.configure(); to my main method, to resolve another error that previously stopped the consumer from even starting.
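For reference, a minimal sketch of the setup being described, assuming log4j 1.x and the old kafka-clients poll(long) API from the article; the class name, broker address, group id, and topic are placeholders, not taken from the article:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.log4j.BasicConfigurator;

public class ConsumerDemo {
    public static void main(String[] args) {
        // BasicConfigurator attaches a ConsoleAppender to the root logger,
        // whose level defaults to DEBUG in log4j 1.x.
        BasicConfigurator.configure();

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder
        props.put("group.id", "test-group");               // placeholder
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("test-topic")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000);
                records.forEach(r -> System.out.println(r.value()));
            }
        }
    }
}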
Create a new config XML file at src/main/resources/logback.xml:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <logger name="org.apache.kafka" level="WARN"/>
  <logger name="org.apache.kafka.common.metrics" level="WARN"/>

  <root level="WARN">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
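Note that logback.xml only takes effect if the application logs through SLF4J with logback-classic as the binding. If you can't edit the XML, the same level change can be applied in code. A minimal sketch, assuming logback-classic is on the classpath (the class name is just for illustration):

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

public class QuietKafkaLogs {
    public static void main(String[] args) {
        // The cast only succeeds when logback-classic is the active SLF4J binding.
        Logger kafkaLogger = (Logger) LoggerFactory.getLogger("org.apache.kafka");
        kafkaLogger.setLevel(Level.WARN); // drops DEBUG/INFO from all Kafka client loggers

        // ... create the KafkaConsumer and poll as before ...
    }
}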
Just modify the logging level of the chatty logger.
Since in your logs you see entries originating from org.apache.kafka.clients.consumer.internals.Fetcher, you can simply adjust the logging level for that logger by adding the following line to log4j.properties:
log4j.logger.org.apache.kafka.clients.consumer.internals.Fetcher=WARN
... or use any broader logger, since these logger names are namespaced:
# adjusting logging for entire Kafka
log4j.logger.org.apache.kafka=WARN
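If editing log4j.properties isn't convenient, the same adjustment can be made programmatically before the consumer is created. A minimal sketch, assuming log4j 1.x is the logging backend (the class name is just for illustration):

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class LogLevelFix {
    public static void main(String[] args) {
        // Programmatic equivalent of log4j.logger.org.apache.kafka=WARN;
        // run this early in main(), before the KafkaConsumer is constructed.
        Logger.getLogger("org.apache.kafka").setLevel(Level.WARN);

        // ... build the KafkaConsumer and poll as before ...
    }
}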
Hope this helps