I am trying to write logs as JSON in Spark 2.0 on EMR. I was able to use a custom log4j.properties file,
but when I tried to switch the output to JSON with a custom layout class (net.logstash.log4j.JSONEventLayoutV1), I got the following exception:
log4j:ERROR Could not instantiate class [net.logstash.log4j.JSONEventLayoutV1].
java.lang.ClassNotFoundException: net.logstash.log4j.JSONEventLayoutV1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:797)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:117)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.initializeLogIfNecessary(CoarseGrainedExecutorBackend.scala:161)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.log(CoarseGrainedExecutorBackend.scala:161)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:172)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:270)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Here is what the log4j.properties file looks like:
log4j.rootCategory=INFO, json
log4j.appender.json=org.apache.log4j.ConsoleAppender
log4j.appender.json.target=System.err
log4j.appender.json.layout=net.logstash.log4j.JSONEventLayoutV1
The "jsonevent-layout" artifact was bundled into the application fat JAR.
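From the stack trace, log4j is initialized while the executor backend itself starts up (LogManager.<clinit> is reached from CoarseGrainedExecutorBackend), so a layout class that only exists inside the application fat JAR may simply not be visible at that point. If that is the cause, something along these lines might be needed so the layout jar is on the driver and executor JVM classpath from the start; the jar path below is hypothetical and the file would have to exist at that location on every node (e.g. copied there by an EMR bootstrap action):

# hypothetical path; the jar must already be present on every node
spark-submit \
  --conf "spark.driver.extraClassPath=/home/hadoop/extrajars/jsonevent-layout.jar" \
  --conf "spark.executor.extraClassPath=/home/hadoop/extrajars/jsonevent-layout.jar" \
  ... rest of the usual submit arguments ...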
Does anyone have a clue how to solve this issue?
Thanks, Eran
Eventually, this worked for me:
log4j.rootCategory=INFO, json
log4j.appender.json=org.apache.log4j.ConsoleAppender
log4j.appender.json.target=System.err
log4j.appender.json.layout=org.apache.hadoop.log.Log4Json
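org.apache.hadoop.log.Log4Json ships with hadoop-common, which is already on the Spark classpath on EMR, which explains why log4j can instantiate it at executor startup without any extra jars. For completeness, one common way to point both the driver and the executors at a custom log4j.properties is via spark-submit; the file name here is just illustrative:

# distribute the properties file and tell log4j in each JVM to use it
# (in client deploy mode the driver side may need a full file:/... path instead)
spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  ...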