I updated my PATH to look like this:
PATH="$HOME/bin:$HOME/.local/bin:$PATH:/home/username/Installs/Spark/bin"
I think it worked, since I managed to run spark-shell from a different directory (although I'm wondering whether I'm going crazy and it was really from the bin folder). However, after rebooting Ubuntu it no longer works. Why? I now get:
Could not find valid SPARK_HOME while searching ['/home/username', '/usr/local/bin']
/usr/local/bin/spark-shell: line 57: /bin/spark-submit: No such file or directory
Setting

PATH="$HOME/bin:$HOME/.local/bin:$PATH:/home/username/Installs/Spark/bin"

lets you run executable scripts such as spark-shell, spark-submit, and pyspark without giving the full path to them.
Besides setting PATH, you also need to set

SPARK_HOME=/home/username/Installs/Spark

which is used internally when you start a Spark cluster or when you use spark-submit.
If you are setting these variables in your .bashrc file, you need the export keyword too:

export SPARK_HOME=/home/username/Installs/Spark

If you don't want to reboot Ubuntu to test that it worked, type

. ~/.profile

into the command line and then try your Spark command.
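Putting it together, here is a minimal sketch of the lines to add to ~/.bashrc, assuming Spark is installed at ~/Installs/Spark (adjust the path to match your actual install directory):

```shell
# Assumed install location; change this if Spark lives elsewhere.
export SPARK_HOME="$HOME/Installs/Spark"

# Append Spark's bin directory so spark-shell, spark-submit, pyspark, etc.
# can be found without typing their full paths.
export PATH="$HOME/bin:$HOME/.local/bin:$PATH:$SPARK_HOME/bin"
```

After saving the file, run `. ~/.profile` (or open a new terminal) and check with `command -v spark-shell`; it should print a path under $SPARK_HOME/bin.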