This combination of HBase / Spark versions appears to be pretty toxic. I have spent hours trying various MergeStrategy settings, but to no avail.
Here is the core of the current build.sbt:
val sparkVersion = "1.0.0"
// val sparkVersion = "1.1.0-SNAPSHOT"
val hbaseVersion = "0.96.1.1-cdh5.0.2"

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client"   % hbaseVersion,
  "org.apache.hbase" % "hbase-common"   % hbaseVersion,
  "org.apache.hbase" % "hbase-server"   % hbaseVersion,
  "org.apache.hbase" % "hbase-protocol" % hbaseVersion,
  "org.apache.hbase" % "hbase-examples" % hbaseVersion,
  ("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(ExclusionRule("org.mortbay.jetty")),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()
)
The following is the error message that inevitably resurfaces:
14/06/27 19:49:24 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:65)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:58)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:222)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
at com.huawei.swlab.sparkpoc.hbase.HBasePop$.main(HBasePop.scala:31)
at com.huawei.swlab.sparkpoc.hbase.HBasePop.main(HBasePop.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last *:runMain for the full output.
14/06/27 19:49:44 INFO ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
I was getting the exact same exception with my Spark/HBase application. The exception means that two copies of the javax.servlet classes end up on the classpath, one of them in a signed jar, and the JVM refuses to load classes from the same package with mismatched signer information. I fixed it by moving the org.mortbay.jetty exclusion rule to my hbase-server dependency:

libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.6-cdh5.2.0" excludeAll ExclusionRule(organization = "org.mortbay.jetty")
If you have hadoop-common as one of your direct dependencies, I also found it necessary to add an exclusion rule for the javax.servlet dependencies:

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.5.0-cdh5.2.0" excludeAll ExclusionRule(organization = "javax.servlet")
I left my Spark dependencies untouched:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0-cdh5.2.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.1.0-cdh5.2.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0-cdh5.2.0"