After spark-submit finishes executing, the Spark web UI is killed. Is there a way to keep it alive?
I am using Spark 1.2.1.
To add a newbie-friendly, step-by-step solution for working with the history server:
In your Spark distribution folder, start the history server:
./sbin/start-history-server.sh
By default the history server monitors /tmp/spark-events
for logs, and unfortunately it will crash if that path doesn't exist. So if you get an error, you may have to mkdir /tmp/spark-events
first. You can check the history server logs in ./logs
for details in case of trouble.
For a context to persist its event log, you have to enable event logging. This can be done either programmatically (see the sketch after the config line below) or by editing ./conf/spark-defaults.conf
(copy the template if it doesn't exist yet) and uncommenting/adding the line:
spark.eventLog.enabled true
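For the programmatic route, here is a minimal sketch that sets the same options on a SparkConf before the context is created; the app name, master, and the throwaway job are illustrative placeholders, and spark.eventLog.dir just spells out the default location the history server watches:

import org.apache.spark.{SparkConf, SparkContext}

// Event logging must be enabled before the context starts;
// it cannot be switched on for a running context.
val conf = new SparkConf()
  .setAppName("event-log-demo") // placeholder name
  .setMaster("local[*]")        // placeholder master
  .set("spark.eventLog.enabled", "true")
  .set("spark.eventLog.dir", "file:///tmp/spark-events")

val sc = new SparkContext(conf)
sc.parallelize(1 to 1000).count() // some work so the log has content
sc.stop() // the event log is finalized when the context stops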
Running a spark-submit
should result in event log folders like /tmp/spark-events/local-1465133722470.
Access the history server UI, typically at http://localhost:18080.
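On Spark 1.4 and later, the history server also exposes the same information over a REST API; a quick sanity-check sketch, assuming the default port:

// Fetch the list of applications the history server knows about.
import scala.io.Source

val apps = Source.fromURL("http://localhost:18080/api/v1/applications").mkString
println(apps)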
If you are testing in local mode, i.e. from IDEA or Eclipse, one way is the following:
System.in.read(); // block until ENTER so the driver UI at http://localhost:4040 stays up
spark.stop(); // spark is your SparkSession
This ensures that the UI stays accessible for as long as you want; just hit ENTER in the IDEA/Eclipse console to terminate the application.
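Put together, a self-contained sketch of this pattern (the app name, master, and sample job are illustrative placeholders; SparkSession requires Spark 2.0+):

import org.apache.spark.sql.SparkSession

object KeepUiAlive {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("keep-ui-alive") // placeholder name
      .master("local[*]")       // local mode, as when run from an IDE
      .getOrCreate()

    // Run something so the UI has jobs and stages to show.
    spark.range(1000000L).selectExpr("sum(id)").show()

    // The driver UI (default http://localhost:4040) lives only while the
    // application is running, so block until the user presses ENTER.
    println("Press ENTER to stop ...")
    System.in.read()
    spark.stop()
  }
}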
You can use Spark's event logging and history server features to view the UIs of completed applications; see https://spark.apache.org/docs/latest/monitoring.html for more details.