
Running Spark on Linux: $JAVA_HOME not set error

I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set

export JAVA_HOME=/home/marc/jdk1.8.0_101
export SCALA_HOME=/home/marc/scala-2.11.8
export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin

at the end of .bashrc, and also included them in the start-all.sh file in the spark/sbin folder.

When I type echo $JAVA_HOME, it gives me the correct path: /home/marc/jdk1.8.0_101
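Re-sourcing .bashrc is what makes new exports visible in an already-open shell; a quick sanity check (a sketch, using the paths from above):

source ~/.bashrc
echo $JAVA_HOME     # should print /home/marc/jdk1.8.0_101
echo $SPARK_HOME    # should print /home/marc/spark-2.0.0-bin-hadoop2.7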

But when I call sbin/start-all.sh, it gives me the following error:

localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set

I tried to follow similar topics, but I couldn't find a solution to the problem. Any help would be much appreciated.

Asked Aug 03 '16 by Marc Zaharescu


People also ask

How do you fix "Please set the JAVA_HOME variable in your environment to match the location of your Java installation"?

To set JAVA_HOME on Windows, do the following: right-click My Computer and select Properties. On the Advanced tab, select Environment Variables, and then edit JAVA_HOME to point to where the JDK software is located, for example C:\Program Files\Java\jdk1.
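The same can be done from a Command Prompt; the JDK path below is only an example and should match the actual install location:

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_101"

Note that setx writes the variable persistently but does not affect the current session, so a new Command Prompt has to be opened for the change to be visible.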

What happens if JAVA_HOME is not set?

If any program that requires a Java runtime fails to find the JAVA_HOME environment variable on startup, or if JAVA_HOME is misconfigured, it will display error messages such as: "A Java installation exists but JAVA_HOME has been set incorrectly."

Why is JAVA_HOME not working?

Verify JAVA_HOME by entering the command echo %JAVA_HOME%. This should output the path to your Java installation folder. If it doesn't, your JAVA_HOME variable was not set correctly. Make sure you're pointing at the correct Java installation folder, or repeat the steps above.
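On Linux, the equivalent check, plus a quick test that the path actually points at a working JDK, looks like this (a minimal sketch):

echo $JAVA_HOME                 # should print the JDK install path
$JAVA_HOME/bin/java -version    # should print the Java version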

How do I change the Java version in Spark?

Setting JAVA_HOME inline, just before the spark-submit command, works: "JAVA_HOME=/path/to/java ./bin/spark-submit ...". Note that this changes the Java version only for the driver of the current application, not for the executors in the cluster, which keep their default Java version (e.g. 1.7).
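To point the executors at the same JDK, the variable can also be forwarded through Spark's spark.executorEnv.* mechanism; the paths and jar name below are placeholders:

JAVA_HOME=/path/to/java ./bin/spark-submit \
  --conf "spark.executorEnv.JAVA_HOME=/path/to/java" \
  your-app.jar

This uses the documented spark.executorEnv.[EnvironmentVariableName] property, which adds the named environment variable to each executor process.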


2 Answers

You need to modify the file named 'spark-config.sh' in the 'sbin' folder. Add your JAVA_HOME export to this file, and everything should work.
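For example, appended to sbin/spark-config.sh (using the JDK path from the question; adjust it to your install):

export JAVA_HOME=/home/marc/jdk1.8.0_101

This works because spark-config.sh is sourced by the other sbin scripts, so the variable is visible when the workers are launched. Alternatively, conf/spark-env.sh is the location Spark's documentation suggests for per-machine settings such as JAVA_HOME.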

Answered Sep 21 '22 by Haoran Yang


Try installing Java on your computer:

First, check if it is there:

java -version

If not installed:

sudo apt-get update
sudo apt-get install openjdk-8-jdk

This should fix the problem.
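If Spark still reports JAVA_HOME as unset after the install, point the variable at the new JDK. On Ubuntu the openjdk-8-jdk package typically installs to the path below; verify the actual location with readlink -f $(which java):

# Typical install path for openjdk-8-jdk on Ubuntu (verify on your system):
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64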

Answered Sep 19 '22 by Guilherme Viegas