
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V

Any reason why I get this error? Initially the IDE's Scala plugin was set to 2.12.3, but since I'm working with Spark 2.2.0, I manually changed it to Scala 2.11.11.

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at scala.xml.Null$.<init>(Null.scala:23)
    at scala.xml.Null$.<clinit>(Null.scala)
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at sparkEnvironment$.<init>(Ticket.scala:33)
    at sparkEnvironment$.<clinit>(Ticket.scala)
    at Ticket$.main(Ticket.scala:39)
    at Ticket.main(Ticket.scala)
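
For context, the stack trace shows the SparkContext being created inside a sparkEnvironment object referenced from Ticket.main. A minimal sketch of what that setup might look like (the object and class names come from the trace; all config values and other details are assumptions, not the actual source):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical reconstruction based on the stack trace: the NoSuchMethodError
    // is thrown while the SparkContext (and its web UI) is being initialized.
    object sparkEnvironment {
      val conf: SparkConf = new SparkConf()
        .setMaster("local[*]")   // assumed local mode
        .setAppName("Ticket")    // assumed application name
      val sc: SparkContext = new SparkContext(conf) // Ticket.scala:33 in the trace
    }

    object Ticket {
      def main(args: Array[String]): Unit = {
        val sc = sparkEnvironment.sc // Ticket.scala:39 in the trace
        // ... job logic ...
        sc.stop()
      }
    }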
Asked Sep 19 '17 by TheShark




1 Answer

Make sure your Spark dependency is compatible with the corresponding Scala version.

This error is common when mixing the Scala 2.12 series with a Spark artifact built against Scala 2.11.
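
For illustration, a build configured like this (a sketch, assuming an sbt build) typically produces exactly this NoSuchMethodError at runtime, because the project compiles against Scala 2.12 while the Spark artifact targets Scala 2.11:

    // Mismatched configuration (illustrative): 2.12 compiler/library vs. a _2.11 Spark artifact
    scalaVersion := "2.12.3"
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"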

You can try using the 2.11 series of Scala with Spark instead, i.e.:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

As you can see, the artifact spark-core_2.11 is built against Scala version 2.11.

That's why it's safer (and more compatible) to use %% and avoid hardcoding the Scala version in Spark dependencies. Let sbt resolve the required Scala version for you automatically, as follows:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

The above declaration automatically appends the Scala binary version from your build's scalaVersion to the artifact name, so the Scala version never needs to be hardcoded.
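
Putting it together, a minimal build.sbt could look like the following sketch (the project name is an assumption; only the spark-core dependency is taken from the answer above):

    // Minimal sbt build for Spark 2.2.0, which requires a Scala 2.11.x release
    name := "ticket-app"            // assumed project name
    version := "0.1.0"
    scalaVersion := "2.11.11"

    // %% appends the Scala binary version, so this resolves to spark-core_2.11
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"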

Answered Sep 18 '22 by Akash Sethi