I'm running a Spark job written in Scala 2.10.4 on a Spark 1.4.0 cluster (based on HDFS and managed with YARN), using the Jackson Scala module version 2.6.1 from the Maven repository.
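For reference, the dependency is declared roughly like this (a minimal sbt sketch; my actual build configuration may differ slightly):

// sbt: pull in the Jackson Scala module from Maven Central
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.1"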
When running the code locally from my IDE (IntelliJ IDEA v14), everything works on the in-memory cluster, but when running the job on my remote cluster (an EMR cluster in an AWS VPC) I'm getting the following exception:
java.lang.AbstractMethodError: com.company.scala.framework.utils.JsonParser$$anon$1.com$fasterxml$jackson$module$scala$experimental$ScalaObjectMapper$_setter_$com$fasterxml$jackson$module$scala$experimental$ScalaObjectMapper$$typeCache_$eq(Lorg/spark-project/guava/cache/LoadingCache;)V
at com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper$class.$init$(ScalaObjectMapper.scala:50)
at com.company.scala.framework.utils.JsonParser$$anon$1.<init>(JsonParser.scala:14)
at com.company.scala.framework.utils.JsonParser$.<init>(JsonParser.scala:14)
at com.company.scala.framework.utils.JsonParser$.<clinit>(JsonParser.scala)
at com.company.migration.Migration$.printAllKeys(Migration.scala:21)
at com.company.migration.Main$$anonfun$main$1.apply(Main.scala:22)
at com.company.migration.Main$$anonfun$main$1.apply(Main.scala:22)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:199)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:56)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
I searched the web for the exception with no luck. I also looked for a similar question here and found only one thread, which had no accepted answer, and none of the answers there helped me.
Hope to find help here,
Thanks.
I'm answering my own question for the benefit of other users who run into this.
I stopped using the ScalaObjectMapper and started working with the regular ObjectMapper:
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val jacksonMapper = new ObjectMapper()
jacksonMapper.registerModule(DefaultScalaModule)
And it works fine for the time being. Attaching piggybox's helpful comment:
The only difference in the code is to use classOf[...] to specify the target type for readValue as the second parameter.
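A minimal sketch of what that looks like (the Person case class and the sample JSON here are hypothetical, just for illustration):

case class Person(name: String, age: Int)

// ScalaObjectMapper allowed mapper.readValue[Person](json);
// the plain ObjectMapper takes the target type as a second argument instead:
val json = """{"name":"Jane","age":30}"""
val person = jacksonMapper.readValue(json, classOf[Person])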