 

AWS Lambda Code in S3 Bucket not updating

I am using CloudFormation to create my Lambda function, with the code in an S3 bucket that has versioning enabled.

"MYLAMBDA": {
      "Type": "AWS::Lambda::Function",
      "Properties": {
        "FunctionName": {
          "Fn::Sub": "My-Lambda-${StageName}"
        },
        "Code": {
          "S3Bucket": {
            "Fn::Sub": "${S3BucketName}"
          },
          "S3Key": {
            "Fn::Sub": "${artifact}.zip"
          },
          "S3ObjectVersion": "1e8Oasedk6sDZu6y01tioj8X._tAl3N"
        },
        "Handler": "streams.lambda_handler",
        "Runtime": "python3.6",
        "Timeout": "300",
        "MemorySize": "512",
        "Role": {
          "Fn::GetAtt": [
            "LambdaExecutionRole",
            "Arn"
          ]
        }
      }
    }

The Lambda function gets created successfully. When I copy a new artifact zip file to the S3 bucket, a new version of the file is created with a new S3ObjectVersion string, but the Lambda function code is still using the older version.

The AWS CloudFormation documentation clearly says the following:

To update a Lambda function whose source code is in an Amazon S3 bucket, you must trigger an update by updating the S3Bucket, S3Key, or S3ObjectVersion property. Updating the source code alone doesn't update the function.

Is there an additional trigger event I need to create to get the code updated?
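For reference, one way to trigger that update programmatically is to look up the new object version with boto3 and feed it into a stack update. This is only a sketch: it assumes the template exposes the version ID through a parameter (called CodeVersionId here) instead of hard-coding it, and the bucket, key, and stack names are placeholders.

import boto3

s3 = boto3.client("s3")
cfn = boto3.client("cloudformation")

# Fetch the version ID that S3 assigned to the most recently uploaded artifact.
# Bucket and key are placeholders for the real values.
head = s3.head_object(Bucket="my-artifact-bucket", Key="artifact.zip")
new_version = head["VersionId"]

# Update the stack, passing the new version ID into the (assumed) CodeVersionId
# parameter that feeds the S3ObjectVersion property of the function's Code block.
cfn.update_stack(
    StackName="my-lambda-stack",
    UsePreviousTemplate=True,
    Parameters=[{"ParameterKey": "CodeVersionId", "ParameterValue": new_version}],
    Capabilities=["CAPABILITY_IAM"],  # required because the stack creates an IAM role
)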

Asked by Kathir on Nov 22 '17



2 Answers

In case anyone is running into this same issue, I figured out a way in my case. I use Terraform + Jenkins to create my Lambda functions through an S3 bucket. In the beginning I could create the functions, but they wouldn't update after creation, even though I verified the zip files in S3 were updated. It took me some time to figure out that I needed to make one of the following two changes.

Solution 1: Give the object a new key when uploading the new zip file. In my Terraform I add the git commit ID as part of the S3 key.

resource "aws_s3_bucket_object" "lambda-abc-package" {
  bucket = "${aws_s3_bucket.abc-bucket.id}"
  key    = "${var.lambda_ecs_task_runner_bucket_key}_${var.git_commit_id}.zip"
  source = "../${var.lambda_ecs_task_runner_bucket_key}.zip"
}

Solution 2: Add source_code_hash to the Lambda resource.

resource "aws_lambda_function" "abc-ecs-task-runner" {
  s3_bucket         = "${var.bucket_name}"
  s3_key            = "${aws_s3_bucket_object.lambda-ecstaskrunner-package.key}"
  function_name     = "abc-ecs-task-runner"
  role              = "${aws_iam_role.AbcEcsTaskRunnerRole.arn}"
  handler           = "index.handler"
  memory_size       = "128"
  runtime           = "nodejs6.10"
  timeout           = "300"
  source_code_hash  = "${base64sha256(file("../${var.lambda_ecs_task_runner_bucket_key}.zip"))}"
}

Either one should work. Also, when checking the Lambda code in the console, refreshing the URL in the browser won't show the update; you need to go back to Functions and open the function again.

Hope this helps.

Answered by L.T. on Sep 19 '22


I also faced the same issue. My code was in Archive.zip in an S3 bucket, and when I uploaded a new Archive.zip, Lambda was not behaving according to the new code.

The solution was to paste the S3 link of Archive.zip into the Lambda function's code section again and hit Save.

How did I figure out that Lambda was not picking up the new code?

Go to your Lambda function --> Actions --> Export function --> Download deployment package, and check whether the code is actually the code you recently uploaded to S3.
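The same fix can also be done outside the console with boto3's update_function_code, which re-points the function at the S3 object so it pulls the newly uploaded zip. A rough sketch, with the function, bucket, and key names as placeholders:

import boto3

lambda_client = boto3.client("lambda")

# Re-point the function at the S3 object so Lambda fetches the new Archive.zip.
# FunctionName, S3Bucket, and S3Key are placeholders for the real values.
lambda_client.update_function_code(
    FunctionName="my-function",
    S3Bucket="my-artifact-bucket",
    S3Key="Archive.zip",
    Publish=True,  # publish a new version so the updated code is what gets invoked
)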

Answered by charany1 on Sep 22 '22