I would like to customize the Log4J configuration for my application in a standalone Spark cluster. I have a log4j.xml file which is inside my application JAR. What is the correct way to get Spark to use that configuration instead of its own Log4J configuration?
I tried using the --conf option to set the following, but no luck:
spark.executor.extraJavaOptions -> -Dlog4j.configuration=log4j.xml
spark.driver.extraJavaOptions -> -Dlog4j.configuration=log4j.xml
I am using Spark 1.4.1 and there's no log4j.properties file in my /conf.
Spark uses log4j as the standard library for its own logging.
If you are using SBT as your package manager/builder:

There is a log4j.properties.template in $SPARK_HOME/conf. Copy it into your project's src/main/resources (e.g. example-spark/src/main/resources/log4j.properties) and remove the .template suffix. Works for me, and similar steps will probably apply to other build tools, e.g. Maven.
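For reference, a minimal log4j.properties along the lines of Spark's bundled template looks like this (the console appender name and the WARN level for Spark's own packages are illustrative choices):

```properties
# Root logger: send INFO and above to the console (stderr)
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet down Spark's own logging so application logs stand out
log4j.logger.org.apache.spark=WARN
```

Once this file sits in src/main/resources, it is packaged at the root of your JAR and picked up from the classpath.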
Try using --driver-java-options. For example:
spark-submit --class my.class --master spark://myhost:7077 --driver-java-options "-Dlog4j.configuration=file:///opt/apps/conf/my.log4j.properties" my.jar
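Note that --driver-java-options only affects the driver. If the executors should use the same configuration, the file also has to be shipped to them; one common pattern (paths and filenames here are illustrative) is to distribute it with --files, after which it lands in each executor's working directory and can be referenced by its bare name:

```
spark-submit --class my.class --master spark://myhost:7077 \
  --driver-java-options "-Dlog4j.configuration=file:///opt/apps/conf/my.log4j.properties" \
  --files /opt/apps/conf/my.log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=my.log4j.properties" \
  my.jar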