As titled, how do I know which version of Spark has been installed on CentOS?
The system currently has CDH 5.1.0 installed.
To test whether your installation was successful, open a terminal, change to your SPARK_HOME directory, and run bin/pyspark. This starts the PySpark shell, which can be used to work with Spark interactively.
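Once the shell is up, you can also confirm the version from inside it. A minimal sketch, assuming the preconfigured SparkContext that the PySpark shell creates for you (named sc):

    # Inside the PySpark shell: the shell pre-creates a SparkContext named sc
    print(sc.version)  # prints the installed Spark version string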
You can check your Spark setup by going to the /bin directory inside {YOUR_SPARK_DIRECTORY} and running the spark-shell --version command. This shows which version of Spark you have, along with the Java and Scala versions it is using. That's it!
Spark 1.4.1! This is a maintenance release that includes contributions from 85 developers. Spark 1.4.1 includes fixes across several areas of Spark, including the DataFrame API, Spark Streaming, PySpark, Spark SQL, and MLlib.
If you use spark-shell, the version appears in the banner at startup.
Programmatically, SparkContext.version can be used.
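For example, here is a minimal standalone PySpark sketch that reads the version from a SparkContext created in a script (it assumes the pyspark modules bundled with your Spark install are on PYTHONPATH; the master and app name shown are just illustrative):

    # Standalone sketch: create a local SparkContext and print its version
    from pyspark import SparkContext

    sc = SparkContext(master="local[1]", appName="version-check")
    print(sc.version)  # same value the spark-shell banner reports
    sc.stop()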
Open a Spark shell terminal and run sc.version.