I want to generate resources from templates plus sbt settings, with different setting values for Compile and Test. The templates live in project/resources/hdfs/*.xml. On the sbt-dev Gitter chat I was pointed at Setting and inConfig, but I couldn't get them to work.
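For context, the templates are plain XML with a {{namenode}} placeholder, and the only transformation the generator has to perform is a string replace. A minimal sketch of that step (the template body below is made up for illustration, not one of my actual files):

```scala
// Hypothetical template body; the real files live in project/resources/hdfs/*.xml.
val template =
  """<configuration>
    |  <property>
    |    <name>fs.defaultFS</name>
    |    <value>hdfs://{{namenode}}:8020</value>
    |  </property>
    |</configuration>""".stripMargin

// The generator's core operation: substitute the placeholder
// with the configuration-specific namenode.
val rendered = template.replace("{{namenode}}", "localhost")
```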
Code so far:
val hdfsNamenode = settingKey[String]("Namenode for the HDFS access")

def genHdfsConfigs: Setting[_] =
  hdfsNamenode := {
    resourceGenerators += Def.task {
      val files = ((baseDirectory.value / "project" / "resources" / "hdfs") * "*.xml").get
      files.foreach { hdfsTemplate =>
        val config = IO.read(hdfsTemplate).replace("{{namenode}}", hdfsNamenode.value)
        IO.write(resourceManaged.value / hdfsTemplate.getName, config)
      }
      files.toSeq
    }.taskValue
  }

hdfsNamenode in Test := "localhost"
hdfsNamenode in Compile := "172.31.32.228"

def allHdfsNamenodeConfigs: Seq[Setting[_]] =
  inConfig(Compile)(Seq(hdfsNamenode)) ++ inConfig(Test)(Seq(hdfsNamenode))
Errors:
[error] found : sbt.Def.Setting[Seq[sbt.Task[Seq[java.io.File]]]]
[error] required: String
[error] resourceGenerators += Def.task {
[error] ^
[error] found : sbt.SettingKey[String]
[error] required: sbt.Def.Setting[_]
[error] inConfig(Compile)(Seq(hdfsNamenode)) ++ inConfig(Test)(Seq(hdfsNamenode))
[error]
For the next iteration, I don't know how to make hdfsNameNode resolve in the Compile/Test scope instead of the unscoped value.
lazy val hdfsNameNode = settingKey[String]("Namenode for the HDFS access")

val genHdfsConfig = Def.task {
  val files = ((baseDirectory.value / "project" / "templates" / "resources" / "hdfs") * "*.xml").get
  files.map { hdfsTemplate =>
    val config = IO.read(hdfsTemplate).replace("{{namenode}}", hdfsNameNode.value)
    val outputPath = resourceManaged.value / hdfsTemplate.getName
    IO.write(outputPath, config)
    outputPath
  }
}

Seq(
  hdfsNameNode := "undefined", // Only this one is accepted
  hdfsNameNode in Test := "localhost",
  hdfsNameNode in Compile := "172.31.32.228",
  resourceGenerators in Compile += genHdfsConfig.taskValue,
  resourceGenerators in Test += genHdfsConfig.taskValue
)
Current (ugly) solution:
val hdfsTestNameNode = "localhost"
val hdfsMainNameNode = "172.31.32.228"

val hdfsNameNode = settingKey[String]("Namenode for the HDFS access")

val genTestHdfsConfig = Def.task {
  val files = ((baseDirectory.value / "project" / "templates" / "resources" / "hdfs") * "*.xml").get
  files.map { hdfsTemplate =>
    val config = IO.read(hdfsTemplate).replace("{{namenode}}", (hdfsNameNode in Test).value)
    val outputPath = resourceManaged.value / hdfsTemplate.getName
    IO.write(outputPath, config)
    outputPath
  }
}

val genCompileHdfsConfig = Def.task {
  val files = ((baseDirectory.value / "project" / "templates" / "resources" / "hdfs") * "*.xml").get
  files.map { hdfsTemplate =>
    val config = IO.read(hdfsTemplate).replace("{{namenode}}", (hdfsNameNode in Compile).value)
    val outputPath = resourceManaged.value / hdfsTemplate.getName
    IO.write(outputPath, config)
    outputPath
  }
}

Seq(
  hdfsNameNode in Test := hdfsTestNameNode,
  hdfsNameNode in Compile := hdfsMainNameNode,
  resourceGenerators in Compile += genCompileHdfsConfig.taskValue,
  resourceGenerators in Test += genTestHdfsConfig.taskValue
)
Just pass the Configuration as an argument when you define your resource generator.
val hdfsTestNameNode = "localhost"
val hdfsMainNameNode = "172.31.32.228"

val hdfsNameNode = settingKey[String]("Namenode for the HDFS access")

def genHdfsConfig(cfg: Configuration) = Def.task {
  val files = ((baseDirectory.value / "project" / "templates" / "resources" / "hdfs") * "*.xml").get
  files.map { hdfsTemplate =>
    val config = IO.read(hdfsTemplate).replace("{{namenode}}", (hdfsNameNode in cfg).value)
    val outputPath = (resourceManaged in cfg).value / hdfsTemplate.getName
    IO.write(outputPath, config)
    outputPath
  }
}

Seq(
  hdfsNameNode in Test := hdfsTestNameNode,
  hdfsNameNode in Compile := hdfsMainNameNode,
  resourceGenerators in Compile += genHdfsConfig(Compile).taskValue,
  resourceGenerators in Test += genHdfsConfig(Test).taskValue
)
Don't want to specify the scope twice? Just refactor a bit more.
val hdfsTestNameNode = "localhost"
val hdfsMainNameNode = "172.31.32.228"

val hdfsNameNode = settingKey[String]("Namenode for the HDFS access")

def addHdfsConfigGenerator(cfg: Configuration) =
  inConfig(cfg) {
    val hdfsConfigGenerator = Def.task {
      val files = ((baseDirectory.value / "project" / "templates" / "resources" / "hdfs") * "*.xml").get
      files.map { hdfsTemplate =>
        val config = IO.read(hdfsTemplate).replace("{{namenode}}", (hdfsNameNode in cfg).value)
        val outputPath = (resourceManaged in cfg).value / hdfsTemplate.getName
        IO.write(outputPath, config)
        outputPath
      }
    }
    resourceGenerators += hdfsConfigGenerator.taskValue
  }.last

Seq(
  hdfsNameNode in Test := hdfsTestNameNode,
  hdfsNameNode in Compile := hdfsMainNameNode,
  addHdfsConfigGenerator(Compile),
  addHdfsConfigGenerator(Test)
)
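These are ordinary settings, so the resulting Seq drops into a project definition like any other settings block. A sketch of the wiring in build.sbt (the project name here is an assumption):

```scala
// Hypothetical build.sbt wiring; "root" is an assumed project name.
lazy val root = (project in file("."))
  .settings(
    hdfsNameNode in Test := hdfsTestNameNode,
    hdfsNameNode in Compile := hdfsMainNameNode,
    addHdfsConfigGenerator(Compile),
    addHdfsConfigGenerator(Test)
  )
```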