
java.lang.NoSuchMethodError Jackson databind and Spark

I am trying to run spark-submit with Spark 1.1.0 and Jackson 2.4.4. I have Scala code that uses Jackson to deserialize JSON into case classes. That works fine on its own, but when I use it with Spark I get the following error:

15/05/01 17:50:11 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 2)
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder.addField(Lcom/fasterxml/jackson/databind/introspect/AnnotatedField;Lcom/fasterxml/jackson/databind/PropertyName;ZZZ)V
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector.com$fasterxml$jackson$module$scala$introspect$ScalaPropertiesCollector$$_addField(ScalaPropertiesCollector.scala:109)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:100)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:99)
    at scala.Option.foreach(Option.scala:236)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:99)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:93)
    at scala.collection.GenTraversableViewLike$Filtered$$anonfun$foreach$4.apply(GenTraversableViewLike.scala:109)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.SeqLike$$anon$2.foreach(SeqLike.scala:635)
    at scala.collection.GenTraversableViewLike$Filtered$class.foreach(GenTraversableViewLike.scala:108)
    at scala.collection.SeqViewLike$$anon$5.foreach(SeqViewLike.scala:80)
    at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector._addFields(ScalaPropertiesCollector.scala:93)
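
For context, the code that "works fine on its own" follows the usual jackson-module-scala pattern. A minimal sketch of that kind of deserialization (the case class and JSON below are illustrative placeholders, not the original code):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Illustrative case class and JSON -- placeholders, not the original code.
case class Record(id: Int, name: String)

object JsonDeserializeSketch {
  // A single ObjectMapper with the Scala module registered so that
  // Scala case classes can be deserialized.
  val mapper = new ObjectMapper()
  mapper.registerModule(DefaultScalaModule)

  def main(args: Array[String]): Unit = {
    val json = """{"id": 1, "name": "example"}"""
    val record = mapper.readValue(json, classOf[Record])
    println(record) // Record(1,example)
  }
}

It is this call path (jackson-module-scala introspecting through jackson-databind) that produces the stack trace above when run under Spark.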

Here is my build.sbt:

//scalaVersion in ThisBuild := "2.11.4"
scalaVersion in ThisBuild := "2.10.5"

retrieveManaged := true

libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value

libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % "test",
  "org.scalatest" %% "scalatest" % "2.2.4" % "test",
  "org.mockito" % "mockito-core" % "1.9.5",
  "org.specs2" %% "specs2" % "2.1.1" % "test",
  "org.scalatest" %% "scalatest" % "2.2.4" % "test"
)

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-core" % "0.20.2",
  "org.apache.hbase" % "hbase" % "0.94.6"
)

//libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"


libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.3.1"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.5.0"

libraryDependencies += "com.typesafe" % "config" % "1.2.1"

resolvers += Resolver.mavenLocal

As you can see, I have tried many different versions of Jackson.

Here is the shell script I use to run spark-submit:

#!/bin/bash
sbt package

CLASS=com.org.test.spark.test.SparkTest

SPARKDIR=/Users/user/Desktop/
#SPARKVERSION=1.3.0
SPARKVERSION=1.1.0
SPARK="$SPARKDIR/spark-$SPARKVERSION/bin/spark-submit"

jar_jackson=/Users/user/scala_projects/lib_managed/bundles/com.fasterxml.jackson.module/jackson-module-scala_2.10/jackson-module-scala_2.10-2.4.4.jar

"$SPARK" \
  --class "$CLASS" \
  --jars $jar_jackson \
  --master local[4] \
  /Users/user/scala_projects/target/scala-2.10/spark_project_2.10-0.1-SNAPSHOT.jar \
  print /Users/user/test.json

As shown above, I pass the path of the Jackson jar to spark-submit with --jars. I have tried different versions of Spark, and I have also specified the paths to the other Jackson jars (databind, annotations, etc.), but that didn't resolve the issue. Any help would be appreciated. Thank you.

asked May 02 '15 by user1077071



1 Answer

I had the same problem: my play-json dependency was pulling in Jackson 2.3.2 while Spark was using Jackson 2.4.4.
When I ran the Spark application, the method could not be found in Jackson 2.3.2 and I got the same exception.

I checked the Maven dependency hierarchy for Jackson. It showed which version was resolved and from which jar (here, play-json brought in 2.3.2), and because play-json was declared first in my dependency list, version 2.3.2 won.
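
A quick way to confirm at runtime which jackson-databind jar actually wins (a sketch, not something the original answer used) is to ask the JVM where the class named in the stack trace was loaded from:

object WhichJackson {
  def main(args: Array[String]): Unit = {
    // The class named in the NoSuchMethodError; its code source reveals
    // which jackson-databind jar is actually on the classpath.
    val clazz = Class.forName("com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder")
    // getCodeSource can be null for bootstrap-loaded classes, hence the Option.
    val location = Option(clazz.getProtectionDomain.getCodeSource).map(_.getLocation)
    println(s"POJOPropertyBuilder loaded from: ${location.getOrElse("unknown")}")
  }
}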

So I moved the play-json dependency to the end of the dependency list, after the Spark dependency, and it worked: this time 2.4.4 was resolved and 2.3.2 was omitted.

Source:

Note that if two dependency versions are at the same depth in the dependency tree, until Maven 2.0.8 it was not defined which one would win, but since Maven 2.0.9 it's the order in the declaration that counts: the first declaration wins.
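
For an sbt build like the one in the question, a more explicit alternative to relying on declaration order is to force a single Jackson version across all dependencies. A sketch (pick the version that matches what your Spark distribution bundles; dependencyOverrides is a Set in sbt 0.13 but a Seq in sbt 1.x):

// build.sbt: pin every Jackson artifact to one version so that Spark and
// the application code agree on the jackson-databind API.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core"    % "jackson-core"         % "2.4.4",
  "com.fasterxml.jackson.core"    % "jackson-databind"     % "2.4.4",
  "com.fasterxml.jackson.core"    % "jackson-annotations"  % "2.4.4",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
)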

answered Oct 10 '22 by Yuvraj Beegala