 

How to prevent logging of pyspark 'answer received' and 'command to send' messages

I am using Python's logging module with PySpark, and PySpark's DEBUG-level messages are flooding my log file, as in the example below. How do I prevent this? Simply setting the log level to INFO is not an option, because I still need to log my own Python DEBUG-level messages.

2015-12-13 15:13:32 4906 DEBUG   : Command to send: j
i
rj
org.apache.spark.SparkConf
e

2015-12-13 15:13:32 4906 DEBUG   : Answer received: yv
2015-12-13 15:13:32 4906 DEBUG   : Command to send: j
i
rj
org.apache.spark.api.java.*
e

2015-12-13 15:13:32 4906 DEBUG   : Answer received: yv
2015-12-13 15:13:32 4906 DEBUG   : Command to send: j
i
rj
org.apache.spark.api.python.*
e
asked Dec 13 '15 by Michael



1 Answer

You can set the logging level for each logger separately:

 import logging

 pyspark_log = logging.getLogger('pyspark')
 pyspark_log.setLevel(logging.ERROR)  # suppress PySpark's DEBUG chatter
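
One caveat: the "Command to send" / "Answer received" messages in the question are emitted by py4j, the Java gateway library that PySpark is built on, under logger names starting with 'py4j' (for example 'py4j.java_gateway'). If raising the 'pyspark' level alone does not silence them, raise the 'py4j' level as well. Below is a minimal sketch of a complete setup, assuming your own code should keep logging at DEBUG; the file name and format string are placeholders:

 import logging

 # Root logger at DEBUG so your own DEBUG messages are recorded.
 logging.basicConfig(
     filename='app.log',  # placeholder path
     level=logging.DEBUG,
     format='%(asctime)s %(process)d %(levelname)s: %(message)s',
 )

 # Raise only the chatty library loggers; child loggers such as
 # 'py4j.java_gateway' inherit these levels.
 logging.getLogger('py4j').setLevel(logging.ERROR)
 logging.getLogger('pyspark').setLevel(logging.ERROR)

 logging.getLogger(__name__).debug('still logged')  # your own messages pass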
answered Sep 18 '22 by Andy