 

Why do Scala 2.11 and Spark with scallop lead to "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror"?

I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar with the Spark dependencies marked as provided (it ends up at analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar).
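For reference, a minimal build.sbt sketch of that kind of setup (the library versions and the use of sbt-assembly are assumptions on my part, not taken from the actual project):

name := "dtex-analysis"

scalaVersion := "2.11.4"

libraryDependencies ++= Seq(
  // Spark is marked "provided" so it stays out of the fat jar produced by sbt-assembly
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  // Scallop command-line parser; the version here is only illustrative
  "org.rogach" %% "scallop" % "0.9.5"
)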

I am able to run the program fine in sbt.

I tried to run it from the command line as follows:

time ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar java -cp /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar com.dtex.analysis.transform.GenUserSummaryView -d /Users/arun/DataSets/LME -p output -s txt -o /Users/arun/tmp/LME/LME

I get the following error message:

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at org.rogach.scallop.package$.<init>(package.scala:37)
    at org.rogach.scallop.package$.<clinit>(package.scala)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.delayedEndpoint$com$dtex$analysis$transform$GenUserSummaryView$Conf$1(GenUserSummaryView.scala:27)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf$delayedInit$body.apply(GenUserSummaryView.scala:26)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at org.rogach.scallop.AfterInit$class.delayedInit(AfterInit.scala:12)
    at org.rogach.scallop.ScallopConf.delayedInit(ScallopConf.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$Conf.<init>(GenUserSummaryView.scala:26)
    at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:54)
    at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)

asked Jan 13 '15 by user2947133


2 Answers

I also ran into the same issue with spark-submit. In my case:

The Spark job was compiled with: Scala 2.10.8

The Scala version Spark was built with on the cluster: Scala 2.11.8

To check the Spark and Scala versions on the cluster, use the spark-shell command.
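For example, the spark-shell startup banner prints both versions, and you can also query them from inside the shell (the output below is illustrative, not copied from a real cluster):

$ spark-shell
...
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM)
...
scala> sc.version
res0: String = 2.0.0

scala> util.Properties.versionString
res1: String = version 2.11.8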

After recompiling the Spark job source with Scala 2.11.8 and submitting it again, it worked.

answered by Ajit K'sagar


The issue is that you've mixed incompatible Scala versions: the Spark assembly was compiled with Scala 2.10, while your application was built with Scala 2.11.

Move everything to Scala 2.10 and make sure you update your sbt build as well.
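A minimal sketch of the corresponding build.sbt changes, assuming the Spark 1.2.0 assembly from the question (the exact 2.10 patch version and the Scallop version are assumptions):

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // %% now resolves the _2.10 artifacts, matching the Scala version of the Spark assembly
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  "org.rogach" %% "scallop" % "0.9.5"
)

After changing scalaVersion, rebuild the fat jar; its name will change from dtex-analysis_2.11-0.1.jar to dtex-analysis_2.10-0.1.jar.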

You may also try to compile the Spark sources against Scala 2.11.7 and use that build instead.
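If you go that route, the Spark 1.x build documentation describes the Scala 2.11 build roughly as below; the script name and profile have changed between releases, so treat this as a sketch and check the docs for your exact Spark version:

# run from the Spark source tree
dev/change-version-to-2.11.sh
mvn -Pscala-2.11 -DskipTests clean package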

answered by bobo32