
Spark exception with java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler

I'm running Spark version 2.1.0 and I'm getting the exception below. The job completes and gives the correct result, but it still throws this exception:

java.lang.ClassNotFoundException: de.unkrig.jdisasm.Disassembler
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.codehaus.janino.SimpleCompiler.disassembleToStdout(SimpleCompiler.java:430)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:404)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:311)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:229)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:196)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:91)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:935)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:998)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:995)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:890)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:357)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
    at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:225)
    at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:272)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2371)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2370)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2377)
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2405)
    at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2404)
    at org.apache.spark.sql.Dataset.withCallback(Dataset.scala:2778)
    at org.apache.spark.sql.Dataset.count(Dataset.scala:2404)

I'm just reading a text file and printing the count:

val rdd = sparkLocal.read.text("/data/logs/file.log")
println(rdd.count)
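
For context, sparkLocal isn't defined in the snippet; a minimal self-contained sketch, assuming it is simply a local SparkSession, would be:

import org.apache.spark.sql.SparkSession

// Assumed setup (not shown in the original snippet): a local SparkSession.
val sparkLocal = SparkSession.builder()
  .appName("log-count")
  .master("local[*]")
  .getOrCreate()

// Note: read.text returns a DataFrame (Dataset[Row]) with one row per
// line of the file, not an RDD, despite the variable name.
val rdd = sparkLocal.read.text("/data/logs/file.log")
println(rdd.count)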

Despite the exception, I do get the correct count.

This is my build.sbt:

libraryDependencies ++= {
  val akkaVersion  = "2.4.10"
  val sparkVersion = "2.1.0"

  Seq(
    "com.typesafe.akka" %% "akka-actor"      % akkaVersion,
    "org.apache.spark"  %% "spark-core"      % sparkVersion,
    "org.apache.spark"  %% "spark-sql"       % sparkVersion,
    "org.apache.spark"  %% "spark-hive"      % sparkVersion,
    "com.typesafe.akka" %% "akka-slf4j"      % akkaVersion,
    "org.apache.spark"  %% "spark-streaming" % sparkVersion
  )
}

Can anyone help?

asked Mar 07 '17 by Muhunthan
1 Answer

I'm having the same issue, which seems to be caused by a known bug in the Janino compiler: as the top of the stack trace shows, Janino's SimpleCompiler tries to load the optional de.unkrig.jdisasm.Disassembler class reflectively, and fails when it isn't on the classpath.

The issue has since been closed, and according to the changelog the fix is included in version 3.0.7, released on 2017-03-22, which is already available on Maven Central.
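
One way to confirm which Janino jar your application actually loads at runtime (a generic JVM trick, not something specific to this issue) is to ask the class where it was loaded from:

// Prints the jar that SimpleCompiler was loaded from; after pinning the
// dependency this should point at janino-3.0.7.jar. getCodeSource can be
// null for bootstrap classes, but not for a regular jar on the classpath.
val janinoJar = Class.forName("org.codehaus.janino.SimpleCompiler")
  .getProtectionDomain.getCodeSource.getLocation
println(janinoJar)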

I have added an explicit

libraryDependencies += "org.codehaus.janino" % "janino" % "3.0.7"

to my build.sbt and so far it seems to solve the problem.
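
If another dependency keeps pulling in an older Janino transitively (for example via spark-sql), sbt's standard dependencyOverrides setting can pin the version instead; a sketch:

// Pin Janino even when it arrives transitively. Keeping commons-compiler
// (Janino's companion artifact) on the same version avoids a mismatch
// between the two.
dependencyOverrides += "org.codehaus.janino" % "janino"           % "3.0.7"
dependencyOverrides += "org.codehaus.janino" % "commons-compiler" % "3.0.7"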

answered Nov 19 '22 by bluenote10