 

How to do logging with Spark in local mode?

Is it possible to enable logging to a certain path (provided as an input argument) in Apache Spark while working in local mode?

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().
  setAppName("SparkProgrammingGuide").
  setMaster(master).
  set("spark.eventLog.enabled", "true").          // turn on event logging
  set("spark.eventLog.dir", "file:///home/USER")  // directory where event logs are written
val sc = new SparkContext(conf)
asked Sep 30 '15 by user706838


People also ask

How do I run a Spark job in local mode?

Running Spark in local mode is simple. If you do not pass a --master flag to spark-shell, pyspark, spark-submit, or any other Spark binary, it runs in local mode. Alternatively, you can pass --master with local as the argument, which defaults to a single thread.
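For completeness, a minimal sketch of setting the master explicitly in code rather than on the command line (the app name here is illustrative):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("LocalModeExample")  // illustrative app name
  .setMaster("local[*]")           // all available cores; "local" would use a single thread
val sc = new SparkContext(conf)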


1 Answer

This question was already answered in the comments and independently validated.

Original answer by @Rohan

You can enable event logging and configure the log path through the SparkContext using the property names spark.eventLog.enabled and spark.eventLog.dir. Both are described in the Spark configuration documentation.

Confirmation by @Yudovin

I have run Spark in local mode with the spark.eventLog.dir parameter set, and the event log file was created. The Spark History Server can be used to view and analyze these logs.
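Putting the pieces together, here is a minimal sketch of the setup the question describes, assuming the event log directory is passed as the first program argument (the object name and example path are illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object EventLogExample {  // hypothetical object name
  def main(args: Array[String]): Unit = {
    // First program argument: event log directory, e.g. file:///home/USER/spark-events
    val eventLogDir = args(0)

    val conf = new SparkConf()
      .setAppName("SparkProgrammingGuide")
      .setMaster("local[*]")
      .set("spark.eventLog.enabled", "true")
      .set("spark.eventLog.dir", eventLogDir)

    val sc = new SparkContext(conf)

    // ... run your job ...

    sc.stop()  // the event log is finalized when the context stops
  }
}

To browse the resulting logs, point the History Server's spark.history.fs.logDirectory at the same location and open its web UI (port 18080 by default).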

answered Oct 14 '22 by Dennis Jaheruddin