
How to get rid of "Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties" message?

I am trying to suppress the message

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

when I run my Spark app. I've successfully redirected the INFO messages, but this message keeps showing up. Any ideas would be greatly appreciated.

Seagull asked Jun 18 '15 20:06


People also ask

How do I change the default log level in Spark?

When it starts, Spark prints: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties. Setting default log level to "WARN". To adjust the logging level at runtime, use sc.setLogLevel(newLevel).

What is the best way to handle logs in Apache Spark?

Spark uses log4j as its logging facility. The default configuration writes all logs to standard error, which is fine for batch jobs. For streaming jobs, however, it is better to use a rolling-file appender, which cuts log files by size and keeps only a few recent files.
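As a sketch of the rolling-file approach described above, a log4j.properties along these lines would cap each file at 50 MB and keep five backups (the file path, sizes, and appender name here are illustrative, not Spark defaults):

```properties
# Route root logging through a size-based rolling appender
log4j.rootCategory=INFO, rolling
log4j.appender.rolling=org.apache.log4j.RollingFileAppender
log4j.appender.rolling.File=/var/log/spark/streaming-job.log
log4j.appender.rolling.MaxFileSize=50MB
log4j.appender.rolling.MaxBackupIndex=5
log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```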



2 Answers

Okay, so I've figured out a way to do this. Initially I had my own log4j.xml that was being used, and hence we were seeing this message. Once I added my own log4j.properties file, the message went away.
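For reference, a minimal log4j.properties placed on the driver classpath (e.g. in SPARK_HOME/conf) could look like the sketch below; the appender name and pattern mirror Spark's shipped template, and the root level is set to ERROR to quiet the INFO chatter:

```properties
# Send everything at ERROR and above to the console on stderr
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```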

Seagull answered Oct 13 '22 02:10


Even simpler: cd into SPARK_HOME/conf, then mv log4j.properties.template log4j.properties, then open log4j.properties and change all INFO entries to ERROR. Here SPARK_HOME is the root directory of your Spark installation.
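The steps above can be sketched as a few shell commands (this assumes the SPARK_HOME environment variable points at your Spark installation, and uses GNU sed's in-place edit rather than opening the file by hand):

```shell
# Activate the template config and lower the log level in one pass
cd "$SPARK_HOME/conf"
mv log4j.properties.template log4j.properties
# Replace every INFO log level with ERROR in place
sed -i 's/INFO/ERROR/g' log4j.properties
```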

Some may be using HDFS as their Spark storage backend and will find that the logging messages are actually generated by HDFS. To alter this, go to the HADOOP_HOME/etc/hadoop/log4j.properties file and simply change hadoop.root.logger=INFO,console to hadoop.root.logger=ERROR,console. Once again, HADOOP_HOME is the root of your Hadoop installation; for me this was /usr/local/hadoop.
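The same change can be made non-interactively (assuming the HADOOP_HOME environment variable is set and GNU sed is available):

```shell
# Quiet HDFS by lowering the Hadoop root logger from INFO to ERROR
sed -i 's/hadoop.root.logger=INFO,console/hadoop.root.logger=ERROR,console/' \
    "$HADOOP_HOME/etc/hadoop/log4j.properties"
```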

quine answered Oct 13 '22 03:10