 

Flink: Scala version conflict?

I am attempting to compile the Kafka sample from here in IntelliJ. After much fussing with dependencies, I have run into this issue that I can't seem to get past:

15/10/25 12:36:34 ERROR actor.ActorSystemImpl: Uncaught fatal error from thread [flink-akka.actor.default-dispatcher-4] shutting down ActorSystem [flink]
java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcVL$sp
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.handleMessage(MemoryArchivist.scala:80)
at org.apache.flink.runtime.FlinkActor$class.receive(FlinkActor.scala:32)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.org$apache$flink$runtime$LogMessages$$super$receive(MemoryArchivist.scala:59)
at org.apache.flink.runtime.LogMessages$class.receive(LogMessages.scala:26)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.receive(MemoryArchivist.scala:59)
at akka.actor.ActorCell.newActor(ActorCell.scala:567)
at akka.actor.ActorCell.create(ActorCell.scala:587)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:460)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:482)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
at akka.dispatch.Mailbox.run(Mailbox.scala:223)
at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


Caused by: java.lang.ClassNotFoundException: scala.runtime.AbstractPartialFunction$mcVL$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 28 more

I've run across a few suggestions that this is an issue with the Scala version. My current library list:

flink-runtime-1.0-SNAPSHOT
flink-streaming-java-1.0-SNAPSHOT
flink-connector-kafka-1.0-SNAPSHOT
flink-java8-1.0-SNAPSHOT
flink-core-1.0-SNAPSHOT
flink-java-1.0-SNAPSHOT
org.apache.hadoop:hadoop-core:1.2.1
flink-clients-1.0-SNAPSHOT
org.apache.kafka:kafka-clients:0.8.2.2
org.apache.kafka:kafka_2.11:0.8.2.2
flink-optimizer-1.0-SNAPSHOT
org.apache.sling:org.apache.sling.commons.json:2.0.6
de.javakaffee:kryo-serializers:0.28
com.github.scopt:scopt_2.11:3.3.0
org.clapper:grizzled-slf4j_2.9.0:0.6.6
com.typesafe.akka:akka-osgi_2.11:2.4.0
com.typesafe.akka:akka-slf4j_2.11:2.4.0

Any suggestions on where I've gone astray?

asked Oct 25 '15 by ethrbunny

1 Answer

The problem is indeed a Scala version mismatch. You're mixing dependencies built for Scala 2.11, e.g. org.apache.kafka:kafka_2.11:0.8.2.2, with Flink dependencies that are built for Scala 2.10 by default.

One of the dependencies built for Scala 2.11 pulls in the scala-library:2.11 jar, which replaces the scala-library:2.10 dependency required by the Flink artifacts. Either use binaries built for Scala 2.10 for the non-Flink dependencies, or build and install Flink with Scala 2.11. See https://ci.apache.org/projects/flink/flink-docs-master/setup/building.html#build-flink-for-a-specific-scala-version for how to build Flink for a specific Scala version.
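As a sketch of the first option, sticking with the default Scala 2.10 Flink builds, the _2.11 artifacts from the list above would be swapped for their _2.10 counterparts where such builds exist, for example:

org.apache.kafka:kafka_2.10:0.8.2.2
com.github.scopt:scopt_2.10:3.3.0
org.clapper:grizzled-slf4j_2.10:1.0.2

The grizzled-slf4j version shown is only illustrative; use whichever release actually ships a _2.10 build. Note that Akka 2.4.0 is published only for Scala 2.11, so the akka-osgi_2.11 and akka-slf4j_2.11 entries cannot simply be renamed; since flink-runtime already pulls in a matching Akka transitively, the simplest route is usually to drop those explicit Akka dependencies altogether.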

Kafka example

If you just want to bump the referenced Kafka example to 0.10-SNAPSHOT, you have to change the Flink version in the pom.xml file and use the FlinkKafkaProducer instead of the KafkaSink in the WriteIntoKafka.java file. You then no longer need the SimpleStringSchema. That is all you have to change; no additional dependencies are required.
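As a rough sketch of that change (assuming a FlinkKafkaProducer constructor that takes a broker list, a topic name, and a serialization schema; the exact signature in your snapshot may differ, and the schema argument may not be needed at all, as noted above), WriteIntoKafka.java would end up along these lines:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class WriteIntoKafka {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Stand-in source; the original example generates its messages differently.
        DataStream<String> messages = env.fromElements("message-1", "message-2");

        // The old KafkaSink is replaced by FlinkKafkaProducer here.
        // Broker address and topic name are placeholders; the constructor used is an assumption.
        messages.addSink(new FlinkKafkaProducer<String>("localhost:9092", "my-topic", new SimpleStringSchema()));

        env.execute("Write into Kafka");
    }
}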

answered by Till Rohrmann