 

Spark 2.1.0 incompatible Jackson version 2.7.6

I am trying to run a simple Spark example in IntelliJ, but I get an error like this:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:819)
    at spark.test$.main(test.scala:19)
    at spark.test.main(test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.7.6
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:730)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)

I have tried to update my Jackson dependencies, but it does not seem to work. I did this:

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

but the same error message still appears. Can someone help me fix this error?

Here is the Spark example code:

object test {
  def main(args: Array[String]): Unit = {
    if (args.length < 1) {
      System.err.println("Usage: <file>")
      System.exit(1)
    }

    val conf = new SparkConf()
    val sc = new SparkContext("local", "wordcount", conf)
    val line = sc.textFile(args(0))

    line.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).collect().foreach(println)

    sc.stop()
  }
}

And here is my build.sbt:

name := "testSpark2"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.10" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"
Yang asked May 08 '17


1 Answer

Spark 2.1.0 already includes com.fasterxml.jackson.core as a transitive dependency, so we do not need to include it in libraryDependencies.
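
If you want to confirm which Jackson version Spark pulls in, here is a quick sketch of how to inspect the resolution from the project's sbt shell (assuming a reasonably recent sbt; the exact output format varies by version):

// From the sbt shell of this project:
// `evicted` lists dependencies where sbt picked one version over another.
> evicted
// `dependencyTree` prints the full transitive tree; on older sbt versions it
// requires the sbt-dependency-graph plugin, on newer ones it is built in.
> dependencyTree

This makes it easy to see whether your explicit jackson-core/jackson-databind entries are winning over the versions Spark brings in, or being evicted.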

But if you want to use a different version of the com.fasterxml.jackson.core dependencies, then you have to override them, like this:

name := "testSpark2"

version := "1.0"

scalaVersion := "2.11.8"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"

So, change your build.sbt like the one above and it will work as expected.
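
To double-check at runtime that the overrides took effect, here is a minimal sketch (the object name JacksonVersionCheck is just an illustration, not part of the original code) that prints the Jackson versions actually found on the classpath. The "Incompatible Jackson version" error in the question is thrown when jackson-module-scala and jackson-databind disagree on their major.minor version, so both lines should report matching versions after the fix.

import com.fasterxml.jackson.databind.cfg.PackageVersion
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object JacksonVersionCheck {
  def main(args: Array[String]): Unit = {
    // Version baked into the jackson-databind jar that ended up on the classpath.
    println("jackson-databind:     " + PackageVersion.VERSION)
    // Version reported by jackson-module-scala; it must match jackson-databind's
    // major.minor for ObjectMapper.registerModule(DefaultScalaModule) to succeed.
    println("jackson-module-scala: " + DefaultScalaModule.version)
  }
}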

I hope it helps!

himanshuIIITian answered Oct 05 '22