Is it possible to write Spark event logs to a certain path, provided as an input argument, while running Apache Spark in local mode?
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("SparkProgrammingGuide")
  .setMaster(master)
  .set("spark.eventLog.enabled", "true")
  .set("spark.eventLog.dir", "file:///home/USER")
val sc = new SparkContext(conf)
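Since the path should arrive as an input argument, here is a minimal sketch that reads the directory from args(0) instead of hard-coding it (the object name and the argument position are assumptions for illustration):

import org.apache.spark.{SparkConf, SparkContext}

object EventLogApp {
  def main(args: Array[String]): Unit = {
    // Event-log directory passed as the first program argument,
    // e.g. "file:///home/USER" (argument position assumed for this sketch)
    val eventLogDir = args(0)
    val conf = new SparkConf()
      .setAppName("SparkProgrammingGuide")
      .setMaster("local[*]")
      .set("spark.eventLog.enabled", "true")
      .set("spark.eventLog.dir", eventLogDir)
    val sc = new SparkContext(conf)
    // ... run jobs here ...
    sc.stop()
  }
}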
So, how do you run Spark in local mode? It is very simple: when no --master flag is passed to spark-shell, pyspark, spark-submit, or any other binary, Spark runs in local mode. Alternatively, pass --master local, which uses a single worker thread; local[N] uses N threads, and local[*] uses one thread per core, as the sketch below shows.
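For illustration, the same master URLs can also be set programmatically through SparkConf (the variable names here are arbitrary):

import org.apache.spark.SparkConf

val oneThread = new SparkConf().setMaster("local")      // a single worker thread
val fourThreads = new SparkConf().setMaster("local[4]") // four worker threads
val allCores = new SparkConf().setMaster("local[*]")    // one thread per available core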
This question was already answered in the comments and independently validated.
Original answer by @Rohan
You can enable event logging and configure its path through the SparkContext using the following property names: spark.eventLog.enabled and spark.eventLog.dir. Both are described in the Spark configuration documentation.
Confirmation by @Yudovin
I have run Spark in local mode with the spark.eventLog.dir parameter, and the log file was created. The Spark History Server can be used for viewing and analyzing these logs.
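For completeness, a sketch of pointing the Spark History Server at the same directory: it reads the spark.history.fs.logDirectory property, which can be set in conf/spark-defaults.conf alongside the event-log settings (the path below is the example path from the question). The server is then started with sbin/start-history-server.sh and serves its UI on port 18080 by default.

spark.eventLog.enabled           true
spark.eventLog.dir               file:///home/USER
spark.history.fs.logDirectory    file:///home/USER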