So, thanks to some easily googled blog posts, I tried:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] =
    loggers.map { loggerName =>
      val logger = Logger.getLogger(loggerName)
      val prevLevel = logger.getLevel
      logger.setLevel(level)
      loggerName -> prevLevel
    }.toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}
Unfortunately it doesn't work; I still get a lot of Spark output, e.g.:
14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4
You can disable LogLog's output by calling LogLog.setQuietMode(true).
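As a minimal sketch, assuming log4j 1.x (where LogLog lives in org.apache.log4j.helpers), this would go early in your test setup:

import org.apache.log4j.helpers.LogLog

// Suppress log4j's own internal "log4j:" diagnostic messages
LogLog.setQuietMode(true)

Note that this only quiets log4j's internal diagnostics, not the log events your application or Spark emits.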
Yes, we should test logging when the logging itself does something that is required: for example, when an external application has hooks that scan the log for certain events. In that case you certainly want to ensure the logging is done.
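One way to assert that, assuming log4j 1.x (a sketch; the CapturingAppender name is hypothetical), is to register an appender that records events so the test can inspect them:

import org.apache.log4j.spi.LoggingEvent
import org.apache.log4j.{AppenderSkeleton, Logger}
import scala.collection.mutable.ListBuffer

// Hypothetical test helper: records every log event so a test can
// assert that an expected message was actually emitted.
class CapturingAppender extends AppenderSkeleton {
  val events = ListBuffer.empty[LoggingEvent]
  override def append(event: LoggingEvent): Unit = events += event
  override def close(): Unit = ()
  override def requiresLayout(): Boolean = false
}

// In a test:
//   val appender = new CapturingAppender
//   Logger.getRootLogger.addAppender(appender)
//   ... exercise the code under test ...
//   assert(appender.events.exists(_.getRenderedMessage.contains("expected event")))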
Add the following to a log4j.properties file inside the src/test/resources directory (create the file and directory if they don't exist):
# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
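If this file ends up being the only log4j configuration on the test classpath, you will probably also need a root logger and a console appender, or log4j will complain that no appenders are configured. An assumed addition (these lines mirror Spark's log4j.properties.template, with the root level set to WARN):

# Root logger and console appender
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n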
When I run my unit tests (using JUnit and Maven), I now receive only WARN-level logs; in other words, no more clutter from INFO-level logs (though they can be useful at times for debugging).
I hope this helps.