I am trying to suppress the message
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
when I run my Spark app. I've redirected the INFO messages successfully; however, this message keeps showing up. Any ideas would be greatly appreciated.

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
Spark uses log4j as its logging facility. The default configuration writes all logs to standard error, which is fine for batch jobs. But for streaming jobs it is better to use a rolling-file appender, cutting log files by size and keeping only a few recent files.
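Here is a minimal sketch of such a configuration, assuming log4j 1.x (the version bundled with Spark 1.x/2.x); the file path, size limit, and backup count are illustrative values, not defaults:

    # log4j.properties - roll the log at 50 MB, keep the 5 most recent files
    log4j.rootLogger=INFO, rolling
    log4j.appender.rolling=org.apache.log4j.RollingFileAppender
    log4j.appender.rolling.File=/var/log/spark/app.log
    log4j.appender.rolling.MaxFileSize=50MB
    log4j.appender.rolling.MaxBackupIndex=5
    log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
    log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n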
Okay, so I've figured out a way to do this. Basically, I had my own log4j.xml initially that was not being picked up, and hence we were seeing this message. Once I added my own "log4j.properties" file, the message went away.
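One common way to make sure a custom log4j.properties is picked up on both the driver and the executors is to pass it through spark-submit; this is a sketch, not the only option, and the exact flags you need depend on your deploy mode (com.example.MyApp and my-app.jar are placeholders):

    spark-submit \
      --files log4j.properties \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      --class com.example.MyApp \
      my-app.jar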
Even simpler: cd to SPARK_HOME/conf, then mv log4j.properties.template log4j.properties, then open log4j.properties and change all INFO to ERROR. Here SPARK_HOME is the root directory of your Spark installation.
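As a sketch, the whole sequence in a shell (the sed one-liner assumes GNU sed and blindly rewrites every occurrence of INFO, so review the file afterwards):

    cd $SPARK_HOME/conf
    mv log4j.properties.template log4j.properties
    # lower every logger from INFO to ERROR
    sed -i 's/INFO/ERROR/g' log4j.properties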
Some may be using HDFS as their Spark storage backend and will find the logging messages are actually generated by HDFS. To alter this, go to the HADOOP_HOME/etc/hadoop/log4j.properties file and simply change hadoop.root.logger=INFO,console to hadoop.root.logger=ERROR,console. Once again, HADOOP_HOME is the root of your Hadoop installation; for me this was /usr/local/hadoop.
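After the change, the relevant line in that file looks like this (the rest of the file stays as shipped):

    # HADOOP_HOME/etc/hadoop/log4j.properties
    hadoop.root.logger=ERROR,console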