I have Spark 1.6.2 and Spark 2.0 installed on my Hortonworks cluster.
Both versions are installed on a node of the 5-node Hadoop cluster.
Each time I start spark-shell I get:
$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
When I check the version I get:
scala> sc.version
res0: String = 1.6.2
How can I start the other version (the spark-shell for Spark 2.0)?
To do this, set the SPARK_MAJOR_VERSION environment variable to the desired major version before you launch the job. For example, if Spark 1.6.2 and the Spark 2.0 technical preview are both installed on a node, and you want to run your job with Spark 2.0, set SPARK_MAJOR_VERSION to 2.
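A minimal example session, assuming an HDP node where both versions are installed (the exact version string printed depends on your build; the one below matches the build shown later in this thread):

$ export SPARK_MAJOR_VERSION=2
$ spark-shell
SPARK_MAJOR_VERSION is set to 2, using Spark2
...
scala> sc.version
res0: String = 2.0.0.2.5.0.0-1245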
Spark is a built-in component of CDH and moves with the CDH version releases. There is no way to downgrade just a single component of CDH, as the components are built to work together in the versions shipped.
If you are using pyspark, the Spark version being used is shown beside the Spark logo in the startup banner, as shown below:
manoj@hadoop-host:~$ pyspark
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
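If the banner has already scrolled by, you can also check from inside the REPL; a quick sketch (sc is the SparkContext the shell creates for you, and the exact string depends on which version was launched):

>>> sc.version
u'1.6.2'

In a Spark 2.x pyspark shell, the pre-created SparkSession reports the same thing via spark.version.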
Spark's shell provides a simple way to learn the API, as well as a powerful tool to analyze data interactively. It is available in either Scala (which runs on the Java VM and is thus a good way to use existing Java libraries) or Python. Start it by running ./bin/spark-shell (Scala) or ./bin/pyspark (Python) from the Spark directory.
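For example, from the top of a Spark installation (the install path below is an assumption for an HDP 2.5 layout; on a managed cluster the shells are usually already on the PATH):

$ cd /usr/hdp/current/spark2-client
$ ./bin/spark-shell      # Scala shell
$ ./bin/pyspark          # Python shell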
export SPARK_MAJOR_VERSION=2
You only need to give the major version, 2 or 1.
$ export SPARK_MAJOR_VERSION=2
$ spark-submit --version
SPARK_MAJOR_VERSION is set to 2, using Spark2
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
      /_/
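The same variable also controls spark-submit, so you can run a job against Spark 2 the same way. A hedged sketch using the bundled SparkPi example (the examples jar path is an assumption for an HDP 2.5 layout; adjust it to your install):

$ export SPARK_MAJOR_VERSION=2
$ spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn --deploy-mode client \
    /usr/hdp/current/spark2-client/examples/jars/spark-examples*.jar 10
SPARK_MAJOR_VERSION is set to 2, using Spark2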
This approach also works:
spark-shell
loads Spark 1.6, while
spark2-shell
loads Spark 2.0.
You can also set the variable for just a single invocation:
$ SPARK_MAJOR_VERSION=2 spark-shell
Or use spark2-submit, pyspark2, or spark2-shell directly.
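A quick way to confirm which build each wrapper points at, reusing the --version check from above (the spark2-* wrappers exist only on installs that ship them, e.g. the CDH Spark 2 parcel; the comments describe the expected output, which depends on your install):

$ spark-submit --version     # reports 1.6.x when SPARK_MAJOR_VERSION is unset
$ spark2-submit --version    # reports the 2.x build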