 

How to find Spark's installation directory?

I want to change spark-env.sh. How can I find the Spark installation directory on Ubuntu?

I looked in the UI but didn't find anything.

whereis spark  

Result: spark:

(i.e., nothing found; whereis only searches the standard system binary and manual-page locations, so a Spark tarball unpacked under a home directory won't show up there.)

Here's the output of locate spark:

/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/11
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/13
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/files
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/jars
/home/sys6002/Desktop/diff spark hadoop.png
/home/sys6002/Desktop/sparkmain
/home/sys6002/Downloads/learning-spark-master.zip
/home/sys6002/Downloads/mongo-spark-master
/home/sys6002/Downloads/spark-1.5.1
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 (2)
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6.tgz
/home/sys6002/Downloads/spark-1.5.1-bin-without-hadoop
/home/sys6002/Downloads/spark-cassandra-connector-master
/home/sys6002/Downloads/spark-core_2.9.3-0.8.0-incubati
/home/sys6002/anaconda3/pkgs/odo-0.3.2-np19py34_0/lib/python3.4/site-packages/odo/backends/tests/__pycache__/test_sparksql.cpython-34.pyc
/home/sys6002/spark-example/a.txt
/home/sys6002/spark-example/a.txt~
/home/sys6002/spark-example/pom.xml
/home/sys6002/spark-example/pom.xml~
/home/sys6002/spark-example/src
/home/sys6002/spark-example/src/main
/home/sys6002/spark-example/src/test
/home/sys6002/spark-example/src/main/java
/home/sys6002/spark-example/src/main/java/com
/home/sys6002/spark-example/src/main/java/com/geekcap
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/App.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/WordCount.java~
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java~
/home/sys6002/spark-example/src/test/java/com/geekcap/javaworld/AppTest.java
/usr/share/app-install/desktop/lightspark:lightspark.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare-invite-opener.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare.desktop
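Among these, the Downloads entries look like unpacked Spark distributions; presumably spark-env.sh would live under a conf subdirectory of whichever copy is actually in use, e.g. (a guess based on the paths above):

ls /home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6/conf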
asked Nov 19 '15 at 14:11 by Anil

People also ask

How do you check for Spark installation?

To test whether your installation was successful, open a Command Prompt, change to the SPARK_HOME directory, and type bin\pyspark. This should start the PySpark shell, which can be used to work with Spark interactively. The last message it prints hints at how to work with Spark in the PySpark shell using the sc or sqlContext names.
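Spelled out as commands, that check looks like this (a sketch; it assumes the SPARK_HOME environment variable is set on Windows):

cd %SPARK_HOME%
bin\pyspark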

Where is Spark path in Windows?

Open a command prompt and go to your Spark bin folder (for example, type cd C:\Users\Desktop\A\spark\bin), then type spark-shell. It may show some warnings and errors, which you can ignore; if the shell starts, Spark is working.
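If the Spark bin folder is already on your PATH, the built-in Windows where command will print its location directly (a sketch; the output path is illustrative):

where spark-shell
REM prints e.g. C:\Users\Desktop\A\spark\bin\spark-shell.cmd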

Where is Spark conf directory?

The default Apache Spark configuration directory is $SPARK_HOME/conf. Some installations, in accordance with the Filesystem Hierarchy Standard (FHS), place the configuration directory under /etc instead.
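On Linux you can confirm you have the right directory by listing it; a stock Spark tarball ships only .template files there. A sketch (the exact file set varies by Spark version):

ls $SPARK_HOME/conf
# e.g. log4j.properties.template  slaves.template  spark-defaults.conf.template  spark-env.sh.template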


1 Answer

Run

echo 'sc.getConf.get("spark.home")' | spark-shell 

After a moment your Spark home will be printed; you'll see something like this:

scala> sc.getConf.get("spark.home")
res0: String = /usr/local/lib/python3.7/site-packages/pyspark

So in this case my Spark home is /usr/local/lib/python3.7/site-packages/pyspark.
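Tying this back to the question: spark-env.sh belongs in the conf subdirectory of that Spark home, and in a stock tarball only spark-env.sh.template exists until you copy it. A sketch (assumes a standard tarball layout; a pip-installed pyspark may not ship a conf directory at all):

cd /usr/local/lib/python3.7/site-packages/pyspark
cp conf/spark-env.sh.template conf/spark-env.sh
# then edit conf/spark-env.sh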

answered Oct 24 '22 at 09:10 by Carter Shanklin