I'm an Upstart newbie (and a Spark newbie, for that matter). I've been able to start a Spark standalone master using:
./spark-1.5.2-bin-hadoop2.4/sbin/start-master.sh
and I want it to start automatically every time the computer is turned on. I looked up Upstart and wrote this simple conf file:
description "start a Spark master with Upstart"
author "Ezer"
exec bash -c '/spark-1.5.2-bin-hadoop2.4/sbin/start-master start'
It does not work, and I get the feeling I'm missing something basic. Any help will be appreciated.
How about:
export SPARK_HOME={YOUR_ACTUAL_SPARK_HOME_PATH}
exec bash $SPARK_HOME/sbin/start-all.sh
in your Upstart conf file? Note, however, that the script spawns background processes (it daemonizes), so Upstart cannot actually track or manage the service this way.
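As a sketch of a job Upstart *can* supervise: `start-master.sh` ultimately launches the master via `spark-class org.apache.spark.deploy.master.Master`, and calling `spark-class` directly keeps the process in the foreground. The path below is an assumption; adjust `SPARK_HOME` to your actual install:

```
# /etc/init/spark-master.conf — a sketch, assuming Spark is unpacked under /opt
description "Spark standalone master"

start on runlevel [2345]
stop on runlevel [!2345]
respawn

env SPARK_HOME=/opt/spark-1.5.2-bin-hadoop2.4

# Run the master in the foreground so Upstart can supervise the process
exec $SPARK_HOME/bin/spark-class org.apache.spark.deploy.master.Master
```

With a foreground `exec` like this, `sudo start spark-master` and `sudo stop spark-master` behave as expected, which the daemonizing `start-master.sh` does not allow.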