
Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric()

I just upgraded my Spark project from 2.2.1 to 2.3.0 and hit the versioning exception below. I have dependencies on spark-cassandra-connector 2.0.7 and cassandra-driver-core 3.4.0 from DataStax, which in turn depend on netty 4.x, whereas Spark 2.3.0 uses 3.9.x.

The class raising the exception, org.apache.spark.network.util.NettyMemoryMetrics, was introduced in Spark 2.3.0.

Is downgrading my Cassandra dependencies the only way around the exception? Thanks!

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
asked Mar 06 '18 by rodders


2 Answers

It seems like you are using a netty 4 version that is too old. Maybe you have multiple netty versions on your classpath? It should not be a problem to have netty 4.x and 3.x on the classpath at the same time.
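One way to check this is to print which jar the offending class is actually loaded from. A minimal sketch, assuming a plain Java entry point run with the same classpath as the Spark application (the class name is taken from the stack trace above):

    import io.netty.buffer.PooledByteBufAllocator;

    public class NettyVersionCheck {
        public static void main(String[] args) {
            // Prints the code source (jar) that PooledByteBufAllocator was loaded from,
            // which shows whether an older netty 4.x artifact is shadowing Spark's.
            System.out.println(PooledByteBufAllocator.class
                    .getProtectionDomain()
                    .getCodeSource()
                    .getLocation());
        }
    }

If the printed path points at an older netty jar (for example one pulled in by the Cassandra driver), that is the artifact to align or exclude.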

answered Oct 29 '22 by Norman Maurer


I would like to add some more details to make this easier to work through. Just run mvn dependency:tree -Dverbose -Dincludes=io.netty:netty-all and it will list every dependency that pulls in io.netty, together with its version. In my case the culprit was Hive JDBC 2.1.0, which brings in a netty-all version lower than the one used by Spark 2.3.1, so Spark's netty was never loaded because the older one from hive-jdbc was already on the classpath.

So the fix is to exclude the netty dependency from hive-jdbc in the pom.xml, as sketched below.
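A minimal sketch of that exclusion, assuming the usual org.apache.hive:hive-jdbc coordinates and that netty-all is the conflicting artifact (confirm the exact groupId and artifactId against the dependency:tree output):

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-jdbc</artifactId>
        <version>2.1.0</version>
        <exclusions>
            <!-- Exclude the older netty pulled in by hive-jdbc so that
                 Spark's own netty version is resolved instead. -->
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>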

answered Oct 30 '22 by Vicky