data "archive_file" "example" {
type = "zip"
output_path = "${local.dest_dir}/hello_upload.zip"
source_file = "${local.src_dir}/hello.py"
source_dir = "${local.src_dir}/pytz"
source_dir = "${local.src_dir}/pytz-2018.5.dist-info"
}
Note that hello.py needs to import pytz, which is not included in the Lambda runtime; that is why I want to upload the package.
When I run the above Terraform I get the error: "source_dir": conflicts with source_file. How can I upload both my Lambda file hello.py and the pytz package, which is a directory?
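One common approach (a sketch, not the only way; local.build_dir and the null_resource name are assumptions for illustration) is to stage everything into a single build directory first, then point source_dir at that directory so the conflict never arises. Note that depends_on on a data source behaves differently across Terraform versions, so treat this as a sketch:

resource "null_resource" "build" {
  # Copy the handler and the pytz package into one staging directory
  # so a single source_dir can cover all of them.
  provisioner "local-exec" {
    command = "mkdir -p ${local.build_dir} && cp ${local.src_dir}/hello.py ${local.build_dir}/ && cp -r ${local.src_dir}/pytz ${local.src_dir}/pytz-2018.5.dist-info ${local.build_dir}/"
  }
}

data "archive_file" "example" {
  type        = "zip"
  output_path = "${local.dest_dir}/hello_upload.zip"
  source_dir  = "${local.build_dir}"

  # Make sure the staging copy runs before the zip is built.
  depends_on = [null_resource.build]
}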
To deploy the multi-file configuration, I just need to run a regular plan, apply, or destroy task. Terraform will find all the .tf files, merge them, and then execute. If a variables file should be used, the -var-file parameter is needed to point Terraform to the file, as in the example below.
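For example, run from the directory containing the .tf files (the variables file name is illustrative):

terraform plan -var-file="example.tfvars"
terraform apply -var-file="example.tfvars"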
You will have only one. As for the lowest impact, it is hard to say; it all depends on your variables and your resource dependencies. If you have to search through several files to understand how a resource is created or where a variable is set, multiple files aren't a great idea...
main.tf will contain the main set of configuration for your module. You can also create other configuration files and organize them however makes sense for your project. variables.tf will contain the variable definitions for your module.
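As a minimal illustration of what goes in variables.tf (the variable name, description, and default are assumptions):

variable "region" {
  description = "AWS region to deploy into"
  type        = string
  default     = "us-east-1"
}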
I had a similar issue when I wanted to add a Python lib defined by symbolic links (also known as "symlinks"). Terraform's archive provider is buggy in that case. I worked around it by using a null_resource to complete the zip archive:
resource "null_resource" "add_my_lib" {
provisioner "local-exec" {
command = "zip -ur ./archive.zip /path/to/the/lib"
}
}
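One caveat: a bare null_resource runs its provisioner only once. A triggers map (sketched below, assuming the base zip comes from the data.archive_file.example shown in the question) can force it to re-run whenever the archive's content changes:

resource "null_resource" "add_my_lib" {
  # Re-run the zip command whenever the base archive changes.
  triggers = {
    archive_sha = data.archive_file.example.output_base64sha256
  }

  provisioner "local-exec" {
    command = "zip -ur ./archive.zip /path/to/the/lib"
  }
}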
Then don't forget to add a depends_on attribute to the resource using the archive:
resource "aws_s3_bucket_object" "my_lambda_layer" {
...
depends_on = [null_resource.add_my_lib]
}
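To close the loop, the uploaded object can then be consumed as a Lambda layer. This is only a sketch; the layer name, runtime list, and resource name are assumptions:

resource "aws_lambda_layer_version" "my_layer" {
  layer_name          = "pytz-layer"
  s3_bucket           = aws_s3_bucket_object.my_lambda_layer.bucket
  s3_key              = aws_s3_bucket_object.my_lambda_layer.key
  compatible_runtimes = ["python3.6"]
}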