I've been following this tutorial to install spark for scala: https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
However, when I try to run spark-shell, I get this error in my console:
/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory
My bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python
So what am I getting wrong? I've installed Spark for Python before, but now I'm trying to use it with Scala. Is Spark confusing the variables? Thanks.
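(For reference, these commands show what the shell actually resolves; the sed line prints the failing line 57 from the error above, though its exact content will vary by Spark version:
which spark-shell
echo $SPARK_HOME
sed -n '57p' "$(which spark-shell)"
)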
Based on @Wilmerton's answer, I came up with the following working configuration inside my ~/.bashrc:
# Apache Spark stuff
export JAVA_HOME=/usr/lib/jvm/default-java/jre
export SPARK_HOME=/usr/lib/spark
export SCALA_HOME=/usr/local/scala/bin
export PATH=$PATH:${SPARK_HOME}/bin
export PATH=$PATH:$SCALA_HOME
(I installed default-jdk with apt-get install default-jdk; aptitude search jdk yields the following entries with a status different from p:
i default-jdk - Standard Java or Java compatible Development Kit
i A default-jdk-headless - Standard Java or Java compatible Development Kit (headless)
i A openjdk-8-jdk - OpenJDK Development Kit (JDK)
i A openjdk-8-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iBA openjdk-8-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-8-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
i openjdk-9-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iB openjdk-9-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-9-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
)
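To check that the new configuration is actually picked up, something like this should work in a fresh shell (the version output will depend on your install):
source ~/.bashrc
echo $SPARK_HOME          # should print /usr/lib/spark
spark-submit --version    # should run without the bin/bin error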
You have one bin too many in the path it's searching: /usr/local/spark/bin/bin/spark-submit should be /usr/local/spark/bin/spark-submit. SPARK_HOME should be /usr/local/spark/ in your case, not /usr/local/spark/bin/ as it seems to be now.
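spark-shell locates the launcher as ${SPARK_HOME}/bin/spark-submit, so pointing SPARK_HOME at the bin directory doubles the bin. A minimal fix in ~/.bashrc, assuming the tutorial's /usr/local/spark install location:
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
Then open a new shell (or source ~/.bashrc) and run spark-shell again.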