I have a Python script that I want to run as a Lambda function on AWS. Unfortunately, the unzipped package is larger than the allowed 250 MB, mainly due to numpy (85 MB) and pandas (105 MB).
I have already done the following, but the size is still too big:
1) Excluded unused folders:
package:
  exclude:
    - testdata/**
    - out/**
    - etc/**
2) Zipped the Python packages:
custom:
  pythonRequirements:
    dockerizePip: true
    zip: true
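(For reference, with zip: true the serverless-python-requirements plugin packs the dependencies into .requirements.zip and expects the handler to unpack them at import time via the unzip_requirements stub it generates, something like:)

# handler.py
# unzip_requirements.py is generated by the plugin when zip: true is set;
# importing it extracts .requirements.zip into /tmp on the first (cold) invocation.
try:
    import unzip_requirements  # noqa: F401
except ImportError:
    pass  # not running inside the packaged Lambda (e.g. local tests)

import numpy as np
import pandas as pd

def handler(event, context):
    # trivial check that the heavy dependencies are importable
    return {"numpy": np.__version__, "pandas": pd.__version__}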
If I unzip the archive generated by serverless package, I find a .requirements.zip containing my Python packages, and additionally my virtual environment in the .virtualenv/ directory, which contains all of those packages again. I have tried excluding the .virtualenv/../lib/python3.6/site-packages/** folder in serverless.yml, but then I get an internal server error when calling the function.
Are there any other parameters to decrease the package size?
The .virtualenv/ directory should not be included in the zip file. If the directory is located next to serverless.yml, it must be added to the exclude section of the serverless.yml file, otherwise it gets packaged along with the other files:
package:
  exclude:
    - ...
    - .virtualenv/**
  include:
    - ...
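Putting the pieces together, the relevant parts of serverless.yml would then look roughly like this (a sketch based on the settings already shown above; adjust the excluded paths to your project layout):

# serverless.yml (excerpt)
package:
  exclude:
    - testdata/**
    - out/**
    - etc/**
    - .virtualenv/**        # keep the local virtual environment out of the artifact

custom:
  pythonRequirements:
    dockerizePip: true      # build dependencies inside a Lambda-like container
    zip: true               # ship them as .requirements.zip, unpacked at runtime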