I want to run a Spark Streaming application on a YARN cluster on a remote server. The default Java version there is 1.7, but I want to use 1.8 for my application, which is also installed on the server but is not the default. Is there a way to specify through spark-submit the location of Java 1.8 so that I do not get the "Unsupported major.minor version" error?
JAVA_HOME was not enough in our case: the driver was running on Java 8, but I discovered later that the Spark workers in YARN were launched using Java 7 (the Hadoop nodes had both Java versions installed).

I had to add spark.executorEnv.JAVA_HOME=/usr/java/<version available in workers> in spark-defaults.conf. Note that you can also provide it on the command line with --conf.
See http://spark.apache.org/docs/latest/configuration.html#runtime-environment
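A minimal sketch of both approaches; the JDK path /usr/java/jdk1.8.0 is an assumption and should be replaced with whatever is actually installed on the worker nodes, and the main class and jar name are hypothetical:

    # in spark-defaults.conf (assumed JDK 8 path on the workers)
    spark.executorEnv.JAVA_HOME  /usr/java/jdk1.8.0

    # or equivalently, per job on the command line
    spark-submit --master yarn \
      --conf spark.executorEnv.JAVA_HOME=/usr/java/jdk1.8.0 \
      --class com.example.MyStreamingApp \
      my-app.jar

The spark-defaults.conf form makes the setting the default for every job submitted from that machine, while --conf overrides it for a single submission.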
Although you can force the driver code to run on a particular Java version (export JAVA_HOME=/path/to/jre/ && spark-submit ...), the workers will execute the code with the default Java version from the yarn user's PATH on the worker machine.
What you can do is set each Spark instance to use a particular JAVA_HOME by editing the spark-env.sh files (documentation).
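A minimal sketch of the spark-env.sh approach, again assuming a JDK 8 install at /usr/java/jdk1.8.0:

    # in conf/spark-env.sh
    export JAVA_HOME=/usr/java/jdk1.8.0

Each node reads its own copy of spark-env.sh when a Spark process starts, so the change needs to be made on every machine in the cluster.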