Is it possible to use json4s 3.2.11 with Spark 1.3.0?

Spark depends on json4s 3.2.10, but this version has several bugs and I need 3.2.11. I added the json4s-native 3.2.11 dependency to build.sbt and everything compiled fine, but when I spark-submit my JAR the runtime classpath still gives me 3.2.10.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"`

plugins.sbt

logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.{Logging, SparkConf, SparkContext}

object App1 extends Logging {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    // Report which json4s version is actually on the runtime classpath;
    // when run via spark-submit this prints 3.2.10, not the 3.2.11 from build.sbt.
    println(s"json4s version: ${org.json4s.BuildInfo.version}")
  }
}

Environment: sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4.

Is there a way to force 3.2.11 version usage?

asked Mar 23 '15 by Alexey Zinoviev

3 Answers

We ran into a problem similar to the one Necro describes, but downgrading from 3.2.11 to 3.2.10 when building the assembly jar did not resolve it. We ended up solving it (using Spark 1.3.1) by shading the 3.2.11 version in the job assembly jar:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
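One caveat worth flagging (my note, not part of the original answer): the assemblyShadeRules key is not available in the sbt-assembly 0.13.0 pinned in the question's plugins.sbt; shading support arrived in a later release (0.14.x, as far as I recall). A minimal sketch of how the pieces fit together, assuming such a newer plugin:

// plugins.sbt (sketch): a shading-capable sbt-assembly release
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")

// build.sbt (sketch): keep json4s 3.2.11 and shade it into its own
// package so it cannot collide with the 3.2.10 bundled with Spark
libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"

assemblyShadeRules in assembly := Seq(
  // rewrites every org.json4s.* reference in the assembly jar,
  // including those in your own compiled classes
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)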
answered by user2303197

I asked the same question on the Spark User mailing list and got two answers on how to make it work:

  1. Use the spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true options (see the submit sketch after this list). Note that these options exist only as of Spark 1.3, and this route will probably require other modifications, such as excluding Scala classes from your build.

  2. Rebuild Spark with json4s 3.2.11 (the version is set in core/pom.xml).

Both work fine; I preferred the second one.
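For illustration, option 1 amounts to passing the two flags at submit time. A sketch, with the assembly jar path as a placeholder (sbt-assembly's default naming for this project):

spark-submit \
  --class App1 \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  target/scala-2.10/sparkapp-assembly-1.0.jar

With these flags set, Spark resolves classes from the user's jars before its own, so the 3.2.11 classes in the assembly take precedence over the 3.2.10 copies that ship with Spark.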

answered by Alexey Zinoviev


This is not an answer to your question, but this page came up when I was searching for my own problem. I was getting a NoSuchMethodError from formats.emptyValueStrategy.replaceEmpty(value) in json4s's render. The reason was that I was building against 3.2.11 while Spark was linking 3.2.10. I downgraded to 3.2.10 and my problem went away. Your question helped me understand what was going on (that Spark links a conflicting version of json4s) and I was able to resolve the problem, so thanks.
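In build.sbt terms, the downgrade is a one-liner, matching the 3.2.10 that Spark 1.3.x itself depends on:

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.10"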

answered by Necro