Terraform - Multiple aws_s3_bucket_notification triggers on the same bucket

I need to create a trigger for an S3 bucket. We create it with the following resource:

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = var.aws_s3_bucket_id

  lambda_function {
    lambda_function_arn = var.lambda_function_arn
    events              = ["s3:ObjectCreated:Put"]
    filter_prefix       = var.filter_prefix
    filter_suffix       = var.filter_suffix
  }
}

This works fine when the bucket does not already have a trigger, which was the case for all environments apart from production. When we deployed to production, the trigger that was already present on the bucket was deleted. We need both triggers. I was able to add a second trigger manually (for example, another PUT event trigger with a different prefix), but when I do it from Terraform the previous one always gets deleted. Is there anything I am missing?

asked Mar 03 '20 by Bogdan Pastiu

1 Answer

The aws_s3_bucket_notification resource documentation mentions this at the top:

NOTE: S3 Buckets only support a single notification configuration. Declaring multiple aws_s3_bucket_notification resources to the same S3 Bucket will cause a perpetual difference in configuration. See the example "Trigger multiple Lambda functions" for an option.

Their example shows how this should be done: add multiple lambda_function blocks to a single aws_s3_bucket_notification resource:

resource "aws_iam_role" "iam_for_lambda" {
  name = "iam_for_lambda"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow"
    }
  ]
}
EOF
}

resource "aws_lambda_permission" "allow_bucket1" {
  statement_id  = "AllowExecutionFromS3Bucket1"
  action        = "lambda:InvokeFunction"
  function_name = "${aws_lambda_function.func1.arn}"
  principal     = "s3.amazonaws.com"
  source_arn    = "${aws_s3_bucket.bucket.arn}"
}

resource "aws_lambda_function" "func1" {
  filename      = "your-function1.zip"
  function_name = "example_lambda_name1"
  role          = "${aws_iam_role.iam_for_lambda.arn}"
  handler       = "exports.example"
  runtime       = "go1.x"
}

resource "aws_lambda_permission" "allow_bucket2" {
  statement_id  = "AllowExecutionFromS3Bucket2"
  action        = "lambda:InvokeFunction"
  function_name = "${aws_lambda_function.func2.arn}"
  principal     = "s3.amazonaws.com"
  source_arn    = "${aws_s3_bucket.bucket.arn}"
}

resource "aws_lambda_function" "func2" {
  filename      = "your-function2.zip"
  function_name = "example_lambda_name2"
  role          = "${aws_iam_role.iam_for_lambda.arn}"
  handler       = "exports.example"
  runtime       = "go1.x"
}

resource "aws_s3_bucket" "bucket" {
  bucket = "your_bucket_name"
}

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = "${aws_s3_bucket.bucket.id}"

  lambda_function {
    lambda_function_arn = "${aws_lambda_function.func1.arn}"
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "AWSLogs/"
    filter_suffix       = ".log"
  }

  lambda_function {
    lambda_function_arn = "${aws_lambda_function.func2.arn}"
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "OtherLogs/"
    filter_suffix       = ".log"
  }
}
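
Applied to the setup in the question, both the trigger this module creates and the one that already existed on the production bucket would need to be declared in the same resource. A sketch, assuming a second set of variables for the pre-existing trigger (existing_lambda_function_arn, existing_filter_prefix, existing_filter_suffix are illustrative names, not from the question):

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = var.aws_s3_bucket_id

  # the trigger this module was already creating
  lambda_function {
    lambda_function_arn = var.lambda_function_arn
    events              = ["s3:ObjectCreated:Put"]
    filter_prefix       = var.filter_prefix
    filter_suffix       = var.filter_suffix
  }

  # the trigger that previously existed on the production bucket
  lambda_function {
    lambda_function_arn = var.existing_lambda_function_arn
    events              = ["s3:ObjectCreated:Put"]
    filter_prefix       = var.existing_filter_prefix
    filter_suffix       = var.existing_filter_suffix
  }
}

Note that Terraform then owns the bucket's entire notification configuration: any trigger added outside Terraform will be removed again on the next apply, so every notification the bucket needs has to be declared in this one resource.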
answered Oct 31 '22 by ydaetskcoR