I am trying to launch the Spark shell for Python from the Spark installation directory using ./bin/pyspark.
When I run the command I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
It appears that I am missing: org.apache.spark.launcher.Main
I am not quite sure how to resolve this issue and was wondering if anyone had any suggestions or had run into similar issues.
Thanks
I also ran into this problem (on my Mac).
I followed the steps on this page: https://stackoverflow.com/a/14875241/5478610 and was able to get past this error.
I think the problem is that even though I had installed Java 8 on my Mac, running java from the command line still invoked Java 6.
The jar file containing the class (./lib/spark-assembly-1.5.1-hadoop2.6.0.jar) couldn't be opened by Java 6. But once I updated the links so that calling java from the terminal used Java 8, I was able to bring up pyspark.
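To see whether you're hitting the same issue, you can check which java binary your terminal actually resolves before launching pyspark. A minimal sketch (the macOS-specific /usr/libexec/java_home lines are assumptions about a typical Mac setup and are left commented out):

```shell
# Show which java the shell resolves first -- this is the one ./bin/pyspark will use.
# On the setup described above, this still reported Java 6 even though Java 8 was installed.
which java || echo "java not found on PATH"
java -version 2>&1 | head -n 1 || true

# macOS only (assumption): ask the system for the Java 8 home and put it first on PATH.
# /usr/libexec/java_home exists only on macOS, so these lines are commented out here.
# export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
# export PATH="$JAVA_HOME/bin:$PATH"
```

After exporting JAVA_HOME and PATH in the same terminal session, re-running java -version should report 1.8 and ./bin/pyspark should start.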