
Upload a local file via the ZipFile parameter of AWS::Lambda::Function

I have a CloudFormation template with an AWS::Lambda::Function resource, and I'm trying to upload a local zip file as the function's code, but it isn't being uploaded. The Lambda function is created without any code.

Resources:
  mastertestingLambdaDataDigestor:
    Properties:
      Code:
        ZipFile: fileb:///home/dariobenitez/Proyectos/dataflow/templates/lambda_template.zip
      FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
      Handler: handler.kinesis_to_dynamodb
      Role: SOMEROLE
      Runtime: python3.6
    Type: AWS::Lambda::Function

The zip file path works when I deploy the same function using the CLI. Any ideas?
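For reference, the CLI call that does accept a local path looks roughly like this (the arguments are reconstructed from the template above, so treat it as a sketch):

aws lambda create-function \
    --function-name mastertesting_Kinesis2DynamoDB_Datapipeline \
    --runtime python3.6 \
    --role SOMEROLE \
    --handler handler.kinesis_to_dynamodb \
    --zip-file fileb:///home/dariobenitez/Proyectos/dataflow/templates/lambda_template.zip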

Thanks a lot!

asked Feb 26 '19 by Edgar Benítez

2 Answers

You can't specify a file path there. ZipFile takes the function code itself, inline, and it's limited to 4096 bytes. If your code is bigger, you need to upload it to S3 first and reference it with S3Bucket and S3Key.

Example:

mastertestingLambdaDataDigestor:
  Properties:
    Code:
      ZipFile: >
        def handler(event, context):
          pass
    FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
    Handler: handler.kinesis_to_dynamodb
    Role: SOMEROLE
    Runtime: python3.6
  Type: AWS::Lambda::Function
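If your code exceeds that inline limit and you manage the S3 upload yourself, a minimal sketch of the S3 form (the bucket name and key here are placeholders for wherever you upload the zip):

Code:
  S3Bucket: my-code-bucket
  S3Key: lambda_template.zip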

Another option is using aws cloudformation package. It uploads the zip file for you and transforms your template into one with the correct S3 references. For this you'll have to put the local zip file path directly in Code. For example:

Resources:
  mastertestingLambdaDataDigestor:
    Properties:
      Code: /home/dariobenitez/Proyectos/dataflow/templates/lambda_template.zip
      FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
      Handler: handler.kinesis_to_dynamodb
      Role: SOMEROLE
      Runtime: python3.6
    Type: AWS::Lambda::Function

Then run:

aws cloudformation package --template-file my-template.yaml --s3-bucket my-bucket

It should output something like:

Resources:
  mastertestingLambdaDataDigestor:
    Properties:
      Code:
        S3Bucket: my-bucket
        S3Key: fjklsdu903490f349034g
      FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
      Handler: handler.kinesis_to_dynamodb
      Role: SOMEROLE
      Runtime: python3.6
    Type: AWS::Lambda::Function

You should then use this template to deploy your stack.
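For example, assuming you saved the transformed template with --output-template-file (the file and stack names here are placeholders):

aws cloudformation package --template-file my-template.yaml --s3-bucket my-bucket --output-template-file packaged-template.yaml
aws cloudformation deploy --template-file packaged-template.yaml --stack-name my-stack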

answered Dec 23 '22 by kichik


The following pattern works for me:

  1. Put some dummy code in the lambda definition

    Properties:
      Code:
        ZipFile: >
          def handler(event, context):
            pass

  2. Use Cloudformation for packaging

    $ aws cloudformation package --template-file /src/apigateway/cf.yml \
        --s3-bucket <SOME_BUCKET> \
        --output-template-file packaged-template.json

  3. Update the Lambda code (a sketch for creating the zip file follows these steps)

    $ aws lambda update-function-code \
        --function-name my-function \
        --zip-file fileb://my-function.zip
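To produce my-function.zip in the first place, something like this works (assuming the handler code lives in handler.py, a file name inferred from the Handler value in the question):

    $ zip my-function.zip handler.py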

CloudFormation doesn't track the code inside the Lambda function, so even when you redeploy the stack, your Lambda code remains whatever you last uploaded through the CLI.

If we just want to upload the Lambda code and keep our CloudFormation template updated with references to the upload, we can use the cloudformation package command.

  1. Put something like this in the CloudFormation template

    Properties:
      Code:

  2. And run

    $ aws cloudformation package --template-file /src/apigateway/cf.yml \
        --s3-bucket <SOME_BUCKET> \
        --output-template-file packaged-template.json

  3. That rewrites the template so it references the uploaded code in S3

    $ grep S3 packaged-template.json -B 2
        Properties:
          Code:
            S3Bucket: <SOME_BUCKET>
            S3Key: 77807d1ae2c2e590e0b928ac579c3aee

answered Dec 23 '22 by samtoddler