I am using the package option to upload a zipped file, like this:
frameworkVersion: "=1.27.3"

service: recipes

provider:
  name: aws
  endpointType: REGIONAL
  runtime: python3.6
  stage: dev
  region: eu-central-1
  memorySize: 512
  deploymentBucket:
    name: dfki-meta
  versionFunctions: false
  stackTags:
    Project: DFKIAPP
  # Allows updates to all resources except deleting/replacing EC2 instances
  stackPolicy:
    - Effect: Allow
      Principal: "*"
      Action: "Update:*"
      Resource: "*"
    - Effect: Deny
      Principal: "*"
      Action:
        - Update:Replace
        - Update:Delete
      Resource: "*"
      Condition:
        StringEquals:
          ResourceType:
            - AWS::EC2::Instance
  # Access to RDS and S3 Bucket
  iamRoleStatements:
    - Effect: "Allow"
      Action: "s3:ListBucket"
      Resource: "*"

package:
  individually: true

functions:
  # get_recipes:
  #   handler: handler.get_recipes
  #   module: recipes_crud
  #   package:
  #     individually: true
  #   timeout: 30
  #   events:
  #     - http:
  #         path: recipes
  #         method: get
  #         request:
  #           parameters:
  #             querystring:
  #               persona: true

  get_recommendation:
    handler: handler.get_recommendation
    module: recipes_ml
    package:
      artifact: zipped_dir.zip
    timeout: 30
    events:
      - http:
          path: recipes/{id}
          method: get
          request:
            parameters:
              paths:
                id: true
              querystring:
                schaerfe_def: true
                saettig_def: true
                erfahrung_def: true
                schaerfe_wunsch: true
                saettig_wunsch: true
                erfahrung_wunsch: true
                gericht_wunsch: true
                stimmung_wunsch: true
I cannot understand this error. Isn't 52.18 MB under 69905067 bytes?
(node:50928) ExperimentalWarning: The fs.promises API is experimental
Serverless: Packaging function: get_recommendation...
Serverless: Uploading function: get_recommendation (52.18 MB)...
Serverless Error ---------------------------------------
Request must be smaller than 69905067 bytes for the UpdateFunctionCode operation
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information -----------------------------
OS: darwin
Node Version: 10.1.0
Serverless Version: 1.27.3
According to the docs (https://docs.aws.amazon.com/lambda/latest/dg/limits.html), the package size should be lower than 50 MB.

From this blog post:

"The 20 MB addition is presumably there to account for request overhead involved with the AWS API (e.g. base64 encoding of the zip file data). So far the 50 MB limit holds true-ish. But, we're not defeated yet."
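That reading matches the number in the error: base64 encoding inflates a payload by a factor of 4/3, and 50 MiB * 4/3 rounds up to exactly the request limit. This is my own back-of-the-envelope check, not something AWS states explicitly:

$ python3 -c "import math; print(math.ceil(50 * 1024 * 1024 * 4 / 3))"
69905067

So once the 52.18 MB zip is base64-encoded into the UpdateFunctionCode request (plus the rest of the request body), it evidently ends up above that threshold.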
This seems to be an issue only when uploading an individual Lambda function with Serverless. If you don't pass the --function parameter and deploy the full stack instead, it works fine.
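For example (using the function name from the config above; the comments reflect my understanding of why the full deploy avoids the limit):

# hits the UpdateFunctionCode request-size limit, because the zip is sent inline with the API call
serverless deploy function --function get_recommendation

# works, because the artifact is uploaded to the S3 deployment bucket and referenced from there
serverless deploy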