
SBT Test Error: java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream

I get the exception below when I run unit tests for my Spark Streaming code with scalatest, via SBT on Windows.

sbt "testOnly <ClassName>"

2018-06-18 02:39:00 ERROR Executor:91 - Exception in task 1.0 in stage 3.0 (TID 11)
java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.<init>(Ljava/io/InputStream;Z)V
    at org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:122)
    at org.apache.spark.serializer.SerializerManager.wrapForCompression(SerializerManager.scala:163)
    at org.apache.spark.serializer.SerializerManager.wrapStream(SerializerManager.scala:124)
    at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$2.apply(BlockStoreShuffleReader.scala:50)
    at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$2.apply(BlockStoreShuffleReader.scala:50)
    at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:417)
    at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:61)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.sort_addToSorter$(Unknown Source)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
    at org.apache.spark.sql.execution.GroupedIterator$.apply(GroupedIterator.scala:29)
    at org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec$StateStoreUpdater.updateStateForKeysWithData(FlatMapGroupsWithStateExec.scala:176)

I tried a couple of ways to exclude the net.jpountz.lz4 jar (following suggestions from other posts), but I still get the same error.

I am currently using Spark 2.3, scalatest 3.0.5, and Scala 2.11. I see this issue only after upgrading to Spark 2.3 and scalatest 3.0.5.
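For reference, my setup corresponds roughly to the build.sbt sketch below (the exact artifact list is an assumption; only the versions are the ones stated above):

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // Structured Streaming and its Kafka source (artifact choice assumed):
  "org.apache.spark" %% "spark-sql"            % "2.3.0",
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.0",
  // Test framework, versions as stated above:
  "org.scalatest"    %% "scalatest"            % "3.0.5" % Test
)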

Any suggestions?

asked Jun 18 '18 by KK2486

2 Answers

Kafka pulls in a version of the lz4 library (net.jpountz.lz4:lz4) that conflicts with the one Spark expects, and that's what caused this issue for me.

This is how you can exclude the dependency in your sbt build file:

lazy val excludeJpountz = ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")

lazy val kafkaClients = "org.apache.kafka" % "kafka-clients" % userKafkaVersionHere excludeAll(excludeJpountz) // add more exclusions here

When you use this kafkaClients dependency, it will exclude the problematic lz4 library.
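Putting it together, a full dependency block could look like this sketch (spark-sql-kafka-0-10 and the version numbers are illustrative, not taken from the question):

lazy val excludeJpountz =
  ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  // The Kafka source pulls in kafka-clients transitively,
  // so the rule is applied to it as well:
  ("org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.0").excludeAll(excludeJpountz),
  ("org.apache.kafka" %  "kafka-clients"        % "0.11.0.3").excludeAll(excludeJpountz)
)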


Update: This appears to be an issue with Kafka 0.11.x.x and earlier versions. As of 1.x.x, Kafka has moved away from the problematic net.jpountz.lz4 library. Therefore, using a recent Kafka (1.x) with a recent Spark (2.3.x) should not have this issue.
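If you control the kafka-clients version directly, a sketch of that upgrade (the exact 1.x version here is illustrative):

// Kafka 1.x depends on org.lz4:lz4-java rather than net.jpountz.lz4:lz4,
// so no exclusion rule should be needed:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "1.1.1"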

answered Oct 20 '22 by marios


The artifact net.jpountz.lz4:lz4 was moved to org.lz4:lz4-java.

Adding the relocated artifact to the build resolved the issue:

libraryDependencies += "org.lz4" % "lz4-java" % "1.7.1"
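If the old coordinates still arrive transitively (e.g. via kafka-clients), one way to keep them off the classpath, sketched here with sbt's built-in excludeDependencies setting:

// Pull in the relocated artifact...
libraryDependencies += "org.lz4" % "lz4-java" % "1.7.1"

// ...and globally exclude the legacy coordinates:
excludeDependencies += ExclusionRule("net.jpountz.lz4", "lz4")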

answered Oct 20 '22 by Prabin Mehta