I'm trying to build a custom task for building our project in our continuous integration environment. It is a sequence of roughly ten steps.

Note that step 10 should run even if an earlier step fails, and its message should be customized depending on which step failed, e.g. if step 5 fails it should say "Compilation failed", and if step 8 fails it should report how many tests were run and how many failed.

To make things extra interesting, this is a multi-project build, so when running tests and publishing results it should run all the tests and publish aggregated results.

To make things even more interesting, the npm tests, jshint and artifact only really make sense in the webapp subproject, where the JavaScript lives and the web server resides.
I've been looking at sbt-release for inspiration, but I'm stymied on several points: how to take the value produced by one task and use it in the next one; how to run tasks in aggregate and get the produced values (I see a method on Extracted for running aggregated tasks, but it doesn't give back the produced values); how to run a task in a subproject and get its produced value; and how to do the error handling.
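For what it's worth, here is a sketch of the Extracted method I mean, called from inside a command where a state: State is in scope (webappRef is a stand-in for the subproject's ProjectRef); it returns only the new State, which is why the produced values are out of reach:

    // sbt 0.13: run `test` in the webapp subproject and its aggregated projects.
    val extracted = Project.extract(state)
    val newState  = extracted.runAggregated(test.in(webappRef, Test), state)
    // newState is all we get back; whatever the tasks produced is lost.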
So far I've tried two approaches. Here is the first:
    npmTest.result.value match {
      case Inc(inc) =>
        println(inc)
      case Value(res) => Def.taskDyn {
        (executeTests in Test).result.value match {
          case Inc(inc) =>
            println(inc)
          case Value(res) =>
            println(res)
        }
      }
    }
The problem with the above is that executeTests is always run, even if npmTest fails, and none of the printlns are executed. Presumably the .value calls get wired up as task dependencies ahead of time rather than being evaluated at the point of the call.
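Presumably the whole match needs to live inside Def.taskDyn for the follow-up task to be chosen dynamically. A sketch of what I mean (npmTest is our own task key, npmThenTest just a scratch name), though this style gets unwieldy once there are ten steps:

    val npmThenTest = Def.taskDyn {
      npmTest.result.value match {
        case Inc(inc) =>
          Def.task(println(inc))
        case Value(res) => Def.taskDyn {
          (executeTests in Test).result.value match {
            case Inc(inc)   => Def.task(println(inc))
            case Value(res) => Def.task(println(res))
          }
        }
      }
    }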
And here is the second:

    npmTest.result.flatMap {
      case Inc(inc) =>
        task { println(inc) }
      case Value(res) =>
        (executeTests in Test).result.flatMap {
          case Inc(inc) =>
            task { println(inc) }
          case Value(res) =>
            task { println(res) }
        }
    }
This one doesn't compile, because (executeTests in Test)... produces an Initialize[Task[Unit]] value where a Task[Unit] is required.
Is there a way to accomplish this with sbt?
By default, sbt executes tasks in parallel (subject to their dependency ordering) in an effort to utilize all available processors, so running steps strictly one after another has to be arranged explicitly. Also by default, each test class is mapped to its own task to enable executing tests in parallel.
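That per-test-class parallelism, at least, can be switched off with a stock setting; a minimal sketch (sbt 0.13 syntax):

    // Run test classes one at a time instead of in parallel.
    parallelExecution in Test := false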
I found a solution that allows you to use good old flatMap and map to compose tasks.
    sealed abstract class Step[A] {
      // The underlying task, with its outcome captured as a Result so that a
      // failure can short-circuit the rest of the chain instead of aborting.
      def run: Def.Initialize[Task[Result[A]]]
      def map[B](f: A => B): Step[B]
      def flatMap[B](f: A => Step[B]): Step[B]
    }

    object Step {
      // A few common keys lifted into Steps.
      val thisProjectRef = settingKey(Keys.thisProjectRef)
      val clean = taskKey(Keys.clean)
      val compile = taskKey(Keys.compile.in(Compile))
      val assembly = taskKey(sbtassembly.AssemblyPlugin.autoImport.assembly)

      private[this] def apply[A](task: Def.Initialize[Task[Result[A]]]): Step[A] =
        new Step[A] {
          val run = task

          // A failure propagates unchanged; a success has f applied to its value.
          def map[B](f: A => B): Step[B] =
            apply[B](Def.taskDyn {
              run.value match {
                case Inc(inc) => Def.task(Inc(inc): Result[B])
                case Value(a) => Def.task(Value(f(a)))
              }
            })

          // Def.taskDyn defers choosing the next task until this one has run,
          // so the next step is only wired up after a success.
          def flatMap[B](f: A => Step[B]): Step[B] =
            apply[B](Def.taskDyn {
              run.value match {
                case Inc(inc) => Def.task(Inc(inc): Result[B])
                case Value(a) => Def.task(f(a).run.value)
              }
            })
        }

      def task[A](t: Def.Initialize[Task[A]]): Step[A] =
        apply(t.result)

      def taskKey[A](t: TaskKey[A]): Step[A] =
        apply(Def.task(t.result.value))

      def settingKey[A](s: SettingKey[A]): Step[A] =
        apply(Def.task(s.value).result)
    }
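The solution above doesn't address the question's per-step failure messages; as a hedged sketch, one extra member could be added to object Step for that (labelled is a name I've made up, not part of the original code):

    // Hypothetical helper: swap a step's failure for a step-specific message,
    // passing successful values through untouched.
    def labelled[A](step: Step[A], msg: String): Step[A] =
      task(Def.taskDyn {
        step.run.value match {
          case Inc(_)   => Def.task[A](sys.error(msg)) // fail the chain with msg
          case Value(a) => Def.task(a)
        }
      })

Something like Step.labelled(Step.compile, "Compilation failed") would then reproduce the step-5 message from the question.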
Then you can define your tasks as
    rainicornPublish <<= {
      // uploadAssemblyTask, runAllTests, finish and RainicornException are
      // project-specific pieces not shown here.
      val result =
        for {
          ass  <- Step.assembly
          juri <- uploadAssemblyTask(ass)
          to   <- runAllTests
          _    <- finish(ass, juri, to)
        } yield (ass, to)

      Def.task(result.run.value match {
        case Inc(inc) => throw new RainicornException(None)
        case Value(v) => v
      })
    }
And each task will happen in sequence, just as you would expect.
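Since Step.taskKey accepts any scoped key, the subproject wrinkle from the question fits the same scheme. A sketch, assuming a Project named webapp and the question's own npmTest and jshint keys:

    // Point steps at the webapp subproject, where the JavaScript lives.
    lazy val npmTestStep = Step.taskKey(npmTest.in(webapp))
    lazy val jshintStep  = Step.taskKey(jshint.in(webapp))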