 

Running Spark Scala example fails


I'm new to both Spark and Scala. I've created an IntelliJ Scala project with SBT and added a few lines to build.sbt.

name := "test-one"  version := "1.0"  scalaVersion := "2.11.2"  libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0" 

My version of Scala is 2.10.4, but this problem also occurs with 2.11.2. Running the project fails with the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
    at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
    at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
    at TweeProcessor$.main(TweeProcessor.scala:10)
    at TweeProcessor.main(TweeProcessor.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 23 more
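The stack trace points at the SparkContext constructor (TweeProcessor.scala:10). A minimal driver along these lines (a hypothetical reconstruction, since the original TweeProcessor isn't shown; app name and master are placeholders) is enough to reproduce it:

import org.apache.spark.{SparkConf, SparkContext}

object TweeProcessor {
  def main(args: Array[String]): Unit = {
    // Print which scala-library actually ends up on the runtime classpath
    println(scala.util.Properties.versionString)

    val conf = new SparkConf().setAppName("test-one").setMaster("local[*]")
    // With spark-core_2.10 on a Scala 2.11 classpath, this constructor throws
    // NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    val sc = new SparkContext(conf)
    sc.stop()
  }
}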

I tried looking this up online; most answers point to a mismatch between API versions and the Scala version, but none are specific to Spark.

asked Oct 14 '14 by user2003470


2 Answers

spark-core_2.10 is built for use with 2.10.x versions of Scala. You should use

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" 

which will select the correct _2.10 or _2.11 artifact for your Scala version.
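For example, the build.sbt from the question would become something like this (a sketch keeping the question's versions; with scalaVersion set to 2.10.4, %% resolves spark-core_2.10):

name := "test-one"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"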

Also make sure you're compiling against the same versions of Scala and Spark as the ones on the cluster where you're running this.

answered Sep 25 '22 by lmm


Downgrade the Scala version to 2.10.4:

name := "test-one"  version := "1.0"  //scalaVersion := "2.11.2" scalaVersion := "2.10.4"  libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0" 
answered Sep 22 '22 by BlitzKrieg