I'm using Ubuntu to set up standalone Spark, but it fails to find the slf4j package when I use the pre-built Spark distribution.
./spark-1.4.1-bin-without-hadoop/sbin/start-master.sh
Spark Command: /usr/lib/jvm/java-7-oracle//bin/java -cp /root/spark-1.4.1-bin-without-hadoop/sbin/../conf/:/root/spark-1.4.1-bin-without-hadoop/lib/spark-assembly-1.4.1-hadoop2.2.0.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip localhost --port 7077 --webui-port 8080
========================================
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2615)
at java.lang.Class.getMethod0(Class.java:2856)
at java.lang.Class.getMethod(Class.java:1668)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
My mistake: I should not have used spark-1.4.1-bin-without-hadoop. Everything works once I download spark-1.4.1-bin-hadoop2.6 instead, so the problem was my usage, not Spark.
1. Launch the PySpark shell. Go to the Spark installation directory on the command line, type bin/pyspark, and press Enter. This launches the PySpark shell and gives you a prompt for interacting with Spark in Python; a quick sanity check is sketched below.
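A minimal sketch, assuming the spark-1.4.1-bin-hadoop2.6 directory from above (the shell creates the SparkContext for you as the variable sc):
cd spark-1.4.1-bin-hadoop2.6   # assumption: your Spark installation directory
bin/pyspark                    # starts the interactive shell and creates sc
# at the >>> prompt, try a tiny job:
#   sc.parallelize([1, 2, 3, 4]).sum()   # should print 10
#   sc.version                           # prints the running Spark version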
Just check http://master:8080, where master is the Spark master machine (8080 is the Web UI port shown in the launch command above). There you will see the Spark master URI, which by default is spark://master:7077; quite a bit of other information lives on that page if you have a Spark standalone cluster.
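Once you have that URI you can point jobs at the cluster. A minimal sketch, assuming the default spark://master:7077 URI and the pi.py example that ships with the binary distribution:
bin/spark-submit --master spark://master:7077 examples/src/main/python/pi.py 10   # submit the Pi example to the standalone master
bin/pyspark --master spark://master:7077                                          # or attach an interactive shell to the same cluster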
Click on the HDFS Web UI; a new web page opens showing the Hadoop DFS (Distributed File System) health status. Click on the Spark Web UI; another page opens showing the Spark cluster and job status.
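If you are not going through such a console, the same pages can be checked directly from the command line; a sketch assuming default ports (50070 for the Hadoop 2.x NameNode UI, 8080 for the Spark master UI) and a host named master:
curl -s -o /dev/null http://master:50070 && echo "HDFS NameNode UI reachable"
curl -s -o /dev/null http://master:8080 && echo "Spark master UI reachable"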
An easy fix is to use the classpath from the hadoop classpath command, as suggested in the Spark documentation:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
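To make this stick for the hadoop-free build, the same line can go into conf/spark-env.sh, which the launch scripts source on startup; a sketch assuming the spark-1.4.1-bin-without-hadoop directory from above and a hadoop command on the PATH:
echo 'export SPARK_DIST_CLASSPATH=$(hadoop classpath)' >> /root/spark-1.4.1-bin-without-hadoop/conf/spark-env.sh
/root/spark-1.4.1-bin-without-hadoop/sbin/start-master.sh   # now finds org.slf4j.Logger on the Hadoop classpath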