
How to run command before data.archive_file zips folder in Terraform?

Tags:

terraform

I am trying to implement an AWS Lambda function using Terraform.

I simply have a null_resource with a local-exec provisioner and a resource.archive_file that zips the source code after all preparation is done.

resource "null_resource" "deps" {

  triggers = {
    package_json = "${base64sha256(file("${path.module}/src/package.json"))}"
  }

  provisioner "local-exec" {
    command = "cd ${path.module}/src && npm install"
  }
}

resource "archive_file" "function" {
    type = "zip"
    source_dir = "${path.module}/src"
    output_path = "${path.module}/function.zip"

    depends_on = [ "null_resource.deps" ]
}

Recent changes to Terraform deprecated resource.archive_file, so data.archive_file should be used instead. Unfortunately, data sources are read before resources are created, so the local-exec provisioner of the dependent resource runs long after the zip is created. The code below no longer produces a deprecation warning, but it does not work at all.

resource "null_resource" "deps" {

  triggers = {
    package_json = "${base64sha256(file("${path.module}/src/package.json"))}"
  }

  provisioner "local-exec" {
    command = "cd ${path.module}/src && npm install"
  }
}

data "archive_file" "function" {
    type = "zip"
    source_dir = "${path.module}/src"
    output_path = "${path.module}/function.zip"

    depends_on = [ "null_resource.deps" ]
}

Am I missing something? What is the correct way to do this with recent versions?

Terraform: v0.7.11, OS: Win10

Asked Nov 22 '16 by Mike Chaliy


2 Answers

It turns out there is an issue with the way Terraform core handles depends_on for data resources. There are a couple of issues reported, one in the archive provider and another in core.

The following workaround is listed in the archive provider issue. Note that it uses a data.null_data_source to sit between the null_resource and data.archive_file, which creates an implicit dependency through interpolation, as opposed to an explicit dependency with depends_on.

resource "null_resource" "lambda_exporter" {
  # (some local-exec provisioner blocks, presumably...)

  triggers = {
    index = "${base64sha256(file("${path.module}/lambda-files/index.js"))}"
  }
}

data "null_data_source" "wait_for_lambda_exporter" {
  inputs = {
    # This ensures that this data resource will not be evaluated until
    # after the null_resource has been created.
    lambda_exporter_id = "${null_resource.lambda_exporter.id}"

    # This value gives us something to implicitly depend on
    # in the archive_file below.
    source_dir = "${path.module}/lambda-files/"
  }
}

data "archive_file" "lambda_exporter" {
  output_path = "${path.module}/lambda-files.zip"
  source_dir  = "${data.null_data_source.wait_for_lambda_exporter.outputs["source_dir"]}"
  type        = "zip"
}
Answered Sep 20 '22 by Yep_It's_Me


There is a new data source in Terraform 0.8, external, that allows you to run external commands and extract their output. See data.external.

The data source should only be used to retrieve some dependency value, not to execute the npm install; you should still do that via the null_resource. Since this is a Terraform data source, it should not have any side effects (although you may need some in this case, not sure).

So basically, the null_resource installs the dependencies, data.external grabs some value that you can depend on for the archive (the directory path, for example), then data.archive_file performs the archiving.

This would probably work best with a pseudo-random directory name, to make dirty checks work a little more cleanly.
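A minimal sketch of that chain, under some assumptions not stated in the answer (the echo-based program and the resource names are illustrative, and this requires Terraform 0.8+ with a POSIX shell available): the external data source interpolates the null_resource's id into its program arguments, so it is read only after npm install has run, and archive_file in turn depends on its result.

```hcl
resource "null_resource" "deps" {
  triggers = {
    package_json = "${base64sha256(file("${path.module}/src/package.json"))}"
  }

  provisioner "local-exec" {
    command = "cd ${path.module}/src && npm install"
  }
}

# Hypothetical bridge: the program just echoes the source directory
# back as JSON. Interpolating null_resource.deps.id forces this data
# source to be read only after the provisioner has finished.
data "external" "wait_for_deps" {
  program = [
    "sh", "-c",
    "echo '{\"source_dir\": \"${path.module}/src\", \"dep_id\": \"${null_resource.deps.id}\"}'",
  ]
}

data "archive_file" "function" {
  type        = "zip"
  source_dir  = "${data.external.wait_for_deps.result["source_dir"]}"
  output_path = "${path.module}/function.zip"
}
```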

Answered Sep 17 '22 by Paul Tyng