I have a small app in Scala that creates a Kafka producer and runs with Apache Spark. When I run the command
spark-submit --master local[2] --deploy-mode client <path to the jar file> <app name> <kafka broker> <kafka in queue> <kafka out queue> <interval>
I am getting this warning: WARN AppInfoParser: Error registering AppInfo mbean javax.management.InstanceAlreadyExistsException: kafka.producer:type=app-info,id=
The code is not relevant, because I am getting this exception when Scala creates the KafkaProducer: val producer = new KafkaProducer[Object, Object](...)
Does anybody have a solution for this? Thank you!
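For context, the producer is created roughly like this (a minimal sketch; the broker address and serializer settings are placeholders, not my actual configuration):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer

// Placeholder configuration; only the producer construction itself matters here.
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

val producer = new KafkaProducer[Object, Object](props)
```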
When a Kafka Producer is created, it attempts to register an MBean using the client.id as its unique identifier.
There are two possibilities for why you are getting the InstanceAlreadyExistsException warning:

1. You are initializing a second Producer with the same client.id in the same JVM.
2. You are not calling close() on an existing Producer before initializing another Producer. Calling close() unregisters the MBean.

If you leave the client.id property blank when initializing the producer, a unique one will be created for you. Giving your producers unique client.id values or allowing them to be auto-generated would resolve this problem.
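A minimal sketch of both fixes (the client.id values and serializer settings below are only illustrative):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

// Fix 1: give each producer its own client.id (or leave it unset so Kafka
// auto-generates a unique one for you).
props.put("client.id", "my-app-producer-1")
val first = new KafkaProducer[String, String](props)

// Fix 2: close an existing producer before creating another one that would
// reuse the same client.id; close() unregisters its MBeans.
first.close()

props.put("client.id", "my-app-producer-2")
val second = new KafkaProducer[String, String](props)
second.close()
```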
In the case of Kafka, MBeans can be used for tracking statistics.
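For example, you can list the producer MBeans registered in the current JVM through the platform MBean server; this is a rough sketch using the standard JMX API:

```scala
import java.lang.management.ManagementFactory
import javax.management.ObjectName

// List every MBean the Kafka producer has registered in this JVM, including the
// app-info bean from the warning (kafka.producer:type=app-info,id=<client.id>).
val mbeanServer = ManagementFactory.getPlatformMBeanServer
val names = mbeanServer.queryNames(new ObjectName("kafka.producer:*"), null)
val it = names.iterator()
while (it.hasNext) println(it.next())
```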