I need to exclude Spark and test dependencies from my final assembly JAR. I tried to use the "provided" scope, but it did not work:
libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % "2.0.1" % "provided")
and then executed sbt assembly.
Please help me resolve this issue.
We exclude dependencies in SBT by using either the exclude or excludeAll keyword. It is important to understand which one to use: excludeAll is more flexible, but it cannot be represented in a pom.xml. Therefore, we should use exclude if a pom.xml needs to be generated from the build.
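As a sketch of the difference, here are both forms applied to a transitive dependency of spark-core (the excluded module, guava, is only an illustrative example):

```scala
// build.sbt

// exclude(org, name) targets one specific module and survives pom.xml generation
libraryDependencies += ("org.apache.spark" % "spark-core_2.11" % "2.0.1")
  .exclude("com.google.guava", "guava")

// excludeAll takes ExclusionRule filters (e.g. a whole organization),
// which is more flexible but cannot be expressed in a pom.xml
libraryDependencies += ("org.apache.spark" % "spark-core_2.11" % "2.0.1")
  .excludeAll(ExclusionRule(organization = "com.google.guava"))
```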
The sbt-assembly plugin is an SBT plugin for building a single, self-contained fat JAR file with all dependencies included. It is inspired by the popular Maven assembly plugin, which serves the same purpose in Maven. Sometimes, though, we do not want certain libraries to end up in that JAR.
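If the plugin is not already set up, it is enabled with a single line in the project's plugin definition (the version shown is illustrative; pick one compatible with your SBT version):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
```

After reloading the build, the assembly task becomes available and produces the fat JAR.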
Use the assemblyExcludedJars setting of the assembly plugin, filtering either by exact file name or with contains:
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { f =>
    f.data.getName.contains("spark-core") ||
    f.data.getName == "spark-core_2.11-2.0.1.jar"
  }
}