Cannot load main class from JAR file

I have a Spark-Scala application. I tried to display a simple message, "Hello my App". It compiles fine with sbt compile and also runs fine with sbt run: the message is displayed successfully, but then it prints an error like this:

Hello my application!
16/11/27 15:17:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/11/27 15:17:11 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
[success] Total time: 13 s, completed Nov 27, 2016 3:17:12 PM
16/11/27 15:17:12 INFO DiskBlockManager: Shutdown hook called

I can't tell whether this is fine or not. Also, when I try to run my JAR file after the build, it displays another error.

My command line looks like:

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

And the error is:

Error: Cannot load main class from JAR file:/root/projectFilms/appfilms
Run with --help for usage help or --verbose for debug output
16/11/27 15:24:11 INFO Utils: Shutdown hook called

Can you please help me?

asked Nov 27 '16 by sirine

People also ask

Why does "could not find or load main class" occur?

The error occurs because the JVM fails to load the main class for the given class or package name. Other common causes of the same error are: the class has been declared in the wrong package, or dependencies are missing from the CLASSPATH.
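As a minimal sketch (the package name "recommendation" and the object name "AppFilms" below are hypothetical, not taken from the question), the package declaration in the source file must match the fully qualified name you give the JVM:

    // src/main/scala/recommendation/AppFilms.scala
    package recommendation

    object AppFilms {
      def main(args: Array[String]): Unit = {
        println("Hello my application!")
      }
    }

Here the class to launch is recommendation.AppFilms; launching plain AppFilms would fail with exactly this error.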

How do I find the main class in a jar file?

Technically a JAR file can contain more than one main class. When Java executes a JAR file, it looks in the META-INF/MANIFEST.MF file inside the JAR to find the entry point. There is no direct command to get this information, but you can unpack the JAR (it's just a ZIP file) and look into the manifest yourself.
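For example, assuming the JAR from the question (target/scala-2.11/system-of-recommandation_2.11-1.0.jar), you can print the manifest with the standard unzip tool:

    unzip -p target/scala-2.11/system-of-recommandation_2.11-1.0.jar META-INF/MANIFEST.MF

If the output contains a Main-Class entry, that class is the entry point; if there is none, you have to name the class yourself (with spark-submit, via --class).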


2 Answers

The error is due to the fact that the SparkContext is not stopped; this is required in versions higher than Spark 2.x. Stop it with SparkContext.stop() (i.e. sc.stop()) at the end of your application to prevent this error. Inspiration for solving this error came from my own experience and the following sources: Spark Context, Spark Listener Bus error.
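As a minimal sketch (the object name and app name below are illustrative, not the asker's actual code), assuming the classic SparkContext API:

    import org.apache.spark.{SparkConf, SparkContext}

    object AppFilms {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("system-of-recommandation")
          .setMaster("local[4]")
        val sc = new SparkContext(conf)

        println("Hello my application!")

        // Stop the context explicitly so the listener bus and cleaner threads
        // shut down cleanly instead of being interrupted by the shutdown hook.
        sc.stop()
      }
    }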

answered Sep 29 '22 by Paul Velthuis


You forgot the --class parameter. In your command

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

spark-submit treats the first non-option argument ("appfilms") as the application JAR, which is why it cannot load a main class from it. It should be:

spark-submit --class "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

Please note: if appfilms belongs to a package, don't forget to prefix it with the package name, as in packagename.appfilms.
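For example, with a hypothetical package name recommendation:

spark-submit --class recommendation.appfilms --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar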

I believe this will suffice

answered Sep 29 '22 by devD