Uploading Multiple files in AWS S3 from terraform

I want to upload multiple files to an AWS S3 bucket from a specific folder on my local machine, but I am running into an error when I apply the configuration.

Here is my Terraform code.

resource "aws_s3_bucket" "testbucket" {
    bucket = "test-terraform-pawan-1"
    acl = "private"

    tags = {
        Name  = "test-terraform"
        Environment = "test"
    }
}

resource "aws_s3_bucket_object" "uploadfile" {
  bucket = "test-terraform-pawan-1"
  key     = "index.html"
  source = "/home/pawan/Documents/Projects/"

}

How can I solve this problem?

asked Aug 12 '19 by pawan19

3 Answers

As of Terraform 0.12.8, you can use the fileset function to get a list of files for a given path and pattern. Combined with for_each, you should be able to upload every file as its own aws_s3_bucket_object:

resource "aws_s3_bucket_object" "dist" {
  for_each = fileset("/home/pawan/Documents/Projects/", "*")

  bucket = "test-terraform-pawan-1"
  key    = each.value
  source = "/home/pawan/Documents/Projects/${each.value}"
  # etag makes the file update when it changes; see https://stackoverflow.com/questions/56107258/terraform-upload-file-to-s3-on-every-apply
  etag   = filemd5("/home/pawan/Documents/Projects/${each.value}")
}

See aws_s3_bucket_object: support for directory uploads (terraform-providers/terraform-provider-aws#3020) on GitHub.

Note: This does not set metadata like content_type, and as far as I can tell Terraform has no built-in way to infer a file's content type. That metadata is important for things like serving the files to a browser over HTTP. If that matters to you, look into specifying each file manually instead of automatically grabbing everything out of a folder.
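
If you do need content_type while still uploading a whole folder, one workaround is to map file extensions to MIME types yourself. A minimal sketch (the mime_types map, its entries, and the dist_typed resource name are illustrative, and it assumes a Terraform release new enough to include the try() function):

locals {
  # Illustrative extension-to-MIME-type map; extend as needed.
  mime_types = {
    ".html" = "text/html"
    ".css"  = "text/css"
    ".js"   = "application/javascript"
    ".json" = "application/json"
    ".png"  = "image/png"
  }
}

resource "aws_s3_bucket_object" "dist_typed" {
  for_each = fileset("/home/pawan/Documents/Projects/", "*")

  bucket = "test-terraform-pawan-1"
  key    = each.value
  source = "/home/pawan/Documents/Projects/${each.value}"
  etag   = filemd5("/home/pawan/Documents/Projects/${each.value}")

  # Look up the MIME type by file extension; fall back to a generic binary
  # type when the extension is unknown or missing (regex errors on a file
  # with no extension, and try() catches that).
  content_type = try(
    lookup(local.mime_types, regex("\\.[^.]+$", each.value)),
    "application/octet-stream"
  )
}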

answered Sep 25 '22 by meustrus


You are trying to upload a directory, whereas Terraform expects a single file in the source field. Uploading a whole folder to an S3 bucket is not yet supported.

However, you can invoke awscli commands using a null_resource with a local-exec provisioner, as suggested here.

resource "null_resource" "remove_and_upload_to_s3" {
  provisioner "local-exec" {
    command = "aws s3 sync ${path.module}/s3Contents s3://${aws_s3_bucket.site.id}"
  }
}
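
One caveat worth knowing: a null_resource runs its provisioner only when the resource is first created, so later changes to the folder will not trigger a new sync on their own. Below is a sketch of one way around that (not from the original answer), keyed on a hash of the folder contents; the s3Contents path and the site bucket are carried over from the snippet above, and the upload_to_s3 name is illustrative:

resource "null_resource" "upload_to_s3" {
  # Re-create this resource, and therefore re-run the sync, whenever any
  # file under s3Contents changes, by hashing every file in the directory.
  triggers = {
    dir_hash = sha1(join("", [
      for f in fileset("${path.module}/s3Contents", "**") :
      filesha1("${path.module}/s3Contents/${f}")
    ]))
  }

  provisioner "local-exec" {
    command = "aws s3 sync ${path.module}/s3Contents s3://${aws_s3_bucket.site.id}"
  }
}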

answered Sep 24 '22 by Vikyol


Since June 9, 2020, Terraform has had a built-in way to infer the content type (and a few other attributes) of a file, which you may need when uploading to an S3 bucket: the hashicorp/dir/template module.

HCL format:

module "template_files" {
  source = "hashicorp/dir/template"

  base_dir = "${path.module}/src"
  template_vars = {
    # Pass in any values that you wish to use in your templates.
    vpc_id = "vpc-abc123"
  }
}

resource "aws_s3_bucket_object" "static_files" {
  for_each = module.template_files.files

  bucket       = "example"
  key          = each.key
  content_type = each.value.content_type

  # The template_files module guarantees that only one of these two attributes
  # will be set for each file, depending on whether it is an in-memory template
  # rendering result or a static file on disk.
  source  = each.value.source_path
  content = each.value.content

  # Unless the bucket has encryption enabled, the ETag of each object is an
  # MD5 hash of that object.
  etag = each.value.digests.md5
}

JSON format:

{
  "resource": {
    "aws_s3_bucket_object": {
      "static_files": {
        "for_each": "${module.template_files.files}"
      }
    }
  }
}

(remaining arguments omitted; they mirror the HCL version above)

Source: https://registry.terraform.io/modules/hashicorp/dir/template/latest

answered Sep 24 '22 by Flair