I have downloaded and built Spark 0.8.0 using sbt/sbt assembly, and the build was successful. However, when running ./bin/start-master.sh the following error appears in the log file:
Spark Command: /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp :/shared/spark-0.8.0-incubating-bin-hadoop1/conf:/shared/spark-0.8.0-incubating-bin-hadoop1/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar
/shared/spark-0.8.0-incubating-bin-hadoop1/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip mellyrn.local --port 7077 --webui-port 8080
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/deploy/master/Master
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.master.Master
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
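As a quick sanity check (not part of the original log), you can list the assembly jar that appears on the classpath above and confirm whether the class the JVM cannot find is actually inside it. The path below is copied from the command line in the log; adjust it to whatever jar your classpath points at:

    # sketch: does the assembly jar actually contain the missing Master class?
    jar tf /shared/spark-0.8.0-incubating-bin-hadoop1/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar \
        | grep org/apache/spark/deploy/master/Master

If the grep prints nothing, the jar on the classpath does not contain the class and the build output is stale or incomplete, which points at the clean rebuild suggested below.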
Update: after doing sbt clean (per the suggestion below) it is now running; see screenshot.
There can be a number of things that cause this error which are not specific to Spark; try ruling those out first. For the Spark build itself, the usual fix is to rebuild from a clean state: run sbt clean compile and build that puppy again.
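For reference, a minimal clean-rebuild sequence matching the layout in the question. It uses the assembly target rather than plain compile, since the classpath in the log points at the assembly jar; the log filename pattern at the end is an assumption, so adjust it to whatever actually appears under logs/:

    cd /shared/spark-0.8.0-incubating-bin-hadoop1
    sbt/sbt clean assembly     # remove stale build output, then rebuild the assembly jar
    ./bin/start-master.sh      # restart the standalone master
    tail -n 50 logs/*.out      # check the master log for the exception (filename pattern assumed)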