
How to keep the Spark web UI alive?

Tags:

apache-spark

After a spark-submit run finishes, the Spark web UI is killed. Is there a way to keep it alive?

I am using Spark 1.2.1.

— poiuytrez, asked Feb 23 '15

3 Answers

To add a newbie-friendly, step-by-step guide to working with the history server:

  • In the Spark distribution folder, start the history server with:

    ./sbin/start-history-server.sh

    By default the history server monitors /tmp/spark-events for logs, and unfortunately it crashes if that path doesn't exist. So if you get an error, you may have to mkdir /tmp/spark-events first. You can check the history server's own logs in ./logs for details in case of trouble.

  • In order for a context to persist its event log, you have to enable event logging. This can be done either programmatically (see the sketch after this list) or by editing ./conf/spark-defaults.conf (copy the template if it doesn't exist yet) and uncommenting/adding the line:

    spark.eventLog.enabled true

    Running a spark-submit should result in event log folders like /tmp/spark-events/local-1465133722470.

  • Access the history server UI, typically at http://localhost:18080.
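For completeness, here is a minimal sketch of the programmatic route mentioned in the second step. It assumes local mode; the object name, application name, and the /tmp/spark-events path are illustrative choices, and the SparkConf/SparkContext API matches the 1.x line the question uses:

    import org.apache.spark.{SparkConf, SparkContext}

    object EventLogExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("event-log-example")        // illustrative name
          .setMaster("local[*]")
          .set("spark.eventLog.enabled", "true")  // same effect as the conf-file line above
          .set("spark.eventLog.dir", "/tmp/spark-events") // must exist before startup
        val sc = new SparkContext(conf)
        sc.parallelize(1 to 100).count() // a trivial job so the event log has content
        sc.stop() // the log is finalized on stop; the history server then serves it
      }
    }

After sc.stop(), the finished application shows up in the history server UI rather than on the live UI at port 4040.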

— bluenote10


If you are testing in local mode, i.e. from IDEA or Eclipse, one way is the following:

System.in.read(); // block until Enter is pressed in the console
spark.stop();     // spark --> SparkSession; stopping it also shuts down the UI

This will ensure that the UI stays accessible for as long as you want. Just hit Enter in the IDEA/Eclipse console to terminate the application.
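For reference, a self-contained sketch of the same trick, assuming Spark 2.x where SparkSession is available (on 1.x, block before calling sc.stop() on the SparkContext instead); the object and application names are illustrative:

    import org.apache.spark.sql.SparkSession

    object KeepUiAlive {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("keep-ui-alive") // illustrative name
          .master("local[*]")
          .getOrCreate()

        spark.range(1000).count() // run something so the UI has jobs to show

        println("Web UI at http://localhost:4040 -- press Enter to exit")
        System.in.read() // block; the UI stays up until stop() is called
        spark.stop()
      }
    }

Note that this only keeps the live UI of a running application alive; once the application stops, you still need the history server to inspect it.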

— Hari


You can use Spark's event logging and history server features to view the UIs of completed applications; see https://spark.apache.org/docs/latest/monitoring.html for more details.
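For reference, a minimal spark-defaults.conf sketch wiring the two features together; the /tmp/spark-events path is just an example, but the event-log directory and the history server's log directory must point at the same place:

    spark.eventLog.enabled           true
    spark.eventLog.dir               file:///tmp/spark-events
    spark.history.fs.logDirectory    file:///tmp/spark-events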

— Josh Rosen