 

Programmatically reduce log in a spark shell

Can I programmatically reduce the logging in a Spark shell by removing all the "INFO" lines? They are spamming my window and I can't analyze the real output. For example:

15/01/09 12:23:02 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 649 bytes result sent to driver
15/01/09 12:23:02 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 500 ms on localhost (1/1)
15/01/09 12:23:02 INFO DAGScheduler: Stage 0 (count at MainApp.scala:31) finished in 0.520 s
15/01/09 12:23:02 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
15/01/09 12:23:02 INFO DAGScheduler: Job 0 finished: count at MainApp.scala:31, took 0.639191 s

Looking for any alternatives if you have some!

Stephane Maarek asked Feb 23 '26 19:02

1 Answer

This suppresses most of the INFO messages (though not all of them):

import org.apache.log4j.{Level, Logger}
// ...
// Raise the level on Spark's ("org") and Akka's logger hierarchies
// so their INFO output is filtered out
val level = Level.WARN
Logger.getLogger("org").setLevel(level)
Logger.getLogger("akka").setLevel(level)

Or as utility methods:

// Accepts a level name such as "WARN"; falls back to INFO if it doesn't parse
def setLogLevel(level: String): Unit = {
  setLogLevel(Level.toLevel(level, Level.INFO))
}

def setLogLevel(level: Level): Unit = {
  Logger.getLogger("org").setLevel(level)
  Logger.getLogger("akka").setLevel(level)
}
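An alternative (not part of the original answer, so take it as a hedged suggestion): Spark reads `conf/log4j.properties` at startup, so you can raise the root log level there instead of doing it in code. A minimal sketch, assuming a standard Spark distribution that ships a `conf/log4j.properties.template` you can copy:

```
# conf/log4j.properties (copied from conf/log4j.properties.template)
# Raise the root level so INFO lines no longer flood the shell
log4j.rootCategory=WARN, console
```

On Spark 1.4 and later you can also call `sc.setLogLevel("WARN")` directly inside the spark-shell to override the effective level for the running context.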
Tobber answered Feb 25 '26 09:02


