 

SBT / ScalaTest: Configurations already specified for module

After cloning an SBT repo and trying to launch the SBT shell inside the directory, I get the following error:

java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test

The complete stack trace is shown below:

[error] java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test
[error]     at scala.Predef$.require(Predef.scala:277)
[error]     at sbt.librarymanagement.DependencyBuilders.moduleIDConfigurable(DependencyBuilders.scala:30)
[error]     at sbt.librarymanagement.DependencyBuilders.moduleIDConfigurable$(DependencyBuilders.scala:29)
[error]     at sbt.package$.moduleIDConfigurable(package.scala:6)
[error]     at $080896ebbef320cbbd4a$.$anonfun$$sbtdef$2(/Users/username/company/repo/submodule/build.sbt:37)
[error]     at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:234)
[error]     at scala.collection.immutable.List.foreach(List.scala:389)
[error]     at scala.collection.TraversableLike.map(TraversableLike.scala:234)
[error]     at scala.collection.TraversableLike.map$(TraversableLike.scala:227)
[error]     at scala.collection.immutable.List.map(List.scala:295)
[error]     at $080896ebbef320cbbd4a$.$anonfun$$sbtdef$1(/Users/username/company/repo/submodule/build.sbt:37)
[error]     at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:197)
[error]     at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:214)
[error]     at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:159)
[error]     at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:82)
[error]     at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:93)
[error]     at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:89)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?

What is the cause of this error, and how can I overcome it?


My project configuration is:

  • Scala v2.11.11
  • SBT v1.0.3

UPDATE-1

Here's my build.sbt file:

import AwsDependencies._
import Dependencies._
import SparkDependencies._

version := "0.0.1"

// core settings
organization := "com.company"
scalaVersion := "2.11.11"

// cache options
offline := false
updateOptions := updateOptions.value.withCachedResolution(true)

// aggregate options
aggregate in assembly := false
aggregate in update := false

// fork options
fork in Test := true

name := "Submodule"
version := "0.0.1"

//common libraryDependencies
libraryDependencies ++= Seq(
  scalaTest,
  typesafeConfig,
  jodaTime,
  mysql,
  json,
  scopt,
  awsS3,
  sparkTesting
)

libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided")

assemblyMergeStrategy in assembly := {
  case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
  case m if m.startsWith("META-INF") => MergeStrategy.discard
  case PathList("javax", "servlet", _@_*) => MergeStrategy.first
  case PathList("org", "apache", _@_*) => MergeStrategy.first
  case PathList("org", "jboss", _@_*) => MergeStrategy.first
  case "about.html" => MergeStrategy.rename
  case "reference.conf" => MergeStrategy.concat
  case "application.conf" => MergeStrategy.concat
  case _ => MergeStrategy.first
}

The stack trace reports the error on the following line of this build.sbt (of the concerned submodule):

libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided")
Asked Dec 21 '17 by y2k-shubham


2 Answers

I know I might be a bit late with this answer, but it seems that one of the entries in your SparkDependencies.allSparkDependencies already has % "provided" attached, so SparkDependencies.allSparkDependencies.map(_ % "provided") attempts to attach the configuration a second time, which causes the failure. Try simply removing % "provided" from the entries in SparkDependencies.allSparkDependencies.
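For illustration, here is a minimal sketch of what such a SparkDependencies.scala might look like and how to fix it. The object layout and the spark-core entry are assumptions for the example, not taken from the original project:

```scala
// project/SparkDependencies.scala (hypothetical layout for illustration)
import sbt._

object SparkDependencies {
  // PROBLEM: a configuration already attached here means build.sbt's
  // .map(_ % "provided") tries to attach one a second time, and sbt
  // fails with "Configurations already specified for module ...":
  // val sparkCore = "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

  // FIX: declare the bare module and let build.sbt attach "provided" once.
  val sparkCore = "org.apache.spark" %% "spark-core" % "2.2.0"

  val allSparkDependencies: Seq[ModuleID] = Seq(sparkCore)
}
```

With the bare declarations, the libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided") line in build.sbt remains the single place where the configuration is set.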

Answered Oct 16 '22 by Michał M.


I had the same problem with a different setup.

In my case the issue was caused by specifying the test configuration in two different places in my sbt setup. In my Dependencies.scala:

object Dependencies {
  // the "test" configuration is attached here
  lazy val someLibrary = "org.foo" %% "someLibrary" % "1.0.0" % "test"
}

And in build.sbt:

lazy val root = (project in file("."))
  .settings(
    // "Test" is attached a second time here, triggering the failure
    libraryDependencies += someLibrary % Test
  )

Once I removed the % "test" from the val definition in Dependencies.scala, the problem was resolved.
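Put together, a corrected version of the two files might look like this, keeping the configuration in exactly one place (file names and the org.foo coordinates are the ones from the answer):

```scala
// project/Dependencies.scala
import sbt._

object Dependencies {
  // bare module: no configuration attached here
  lazy val someLibrary = "org.foo" %% "someLibrary" % "1.0.0"
}
```

```scala
// build.sbt
import Dependencies._

lazy val root = (project in file("."))
  .settings(
    // the Test configuration is attached exactly once, here
    libraryDependencies += someLibrary % Test
  )
```

The reverse also works: keep % "test" in Dependencies.scala and drop % Test from build.sbt. The requirement is only that a configuration be attached at most once per module.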

Answered Oct 16 '22 by Ramón J Romero y Vigil