I wanted to follow these tips and just redeploy my function, as the serverless.yml had not been changed.
However, it just hangs on the Serverless: Uploading function stage. Forever, apparently.
The whole deploy (with sls deploy) works, though slowly.
How can I debug this, given that there is apparently no error message?
EDIT
When I use sls deploy, my project takes about 4 min 15 s to deploy. That seems rather long to me, so I thought I would use sls deploy function -f myFunction instead, which is supposed to be much faster.
However, when I try sls deploy function -f myFunction, it seems to just hang forever on Serverless: Uploading function: myFunction.
I have no idea how to debug that.
Using --verbose (sls deploy function -f myFunction --verbose) does not seem to make a difference; the messages returned are the same.
I will try to wait and see if, eventually, the function deploy completes...
Well, I waited, and it doesn't: after about 8 min 30s I get the following error message:
Serverless Error ---------------------------------------

  Connection timed out after 120000ms

Get Support --------------------------------------------
  Docs:   docs.serverless.com
  Bugs:   github.com/serverless/serverless/issues
  Forums: forum.serverless.com
  Chat:   gitter.im/serverless/serverless

Your Environment Information -----------------------------
  OS:                 linux
  Node Version:       7.10.0
  Serverless Version: 1.20.2
Another oddity: while it hangs, the output reads:
Serverless: Uploading function: myFunction (12.05 MB)...
But the function itself is just 3.2 kB, and does not include any packages.
When I use sls deploy, the size displayed is the same:
Serverless: Uploading service .zip file to S3 (12.05 MB)...
What could be wrong with my function deploy?
EDIT 2
As @dashmug hinted, there is a config issue in serverless.yml.
In the functions dir of my serverless project, I would like to have a common package.json and node_modules. Then each function could import modules as needed.
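Schematically, the layout is something like this (apart from the two module names, which appear in the config below, the file names here are placeholders):

    serverless.yml
    functions/
        package.json
        node_modules/
            module1_I_want_to_include/
            module2_I_want_to_include/
            ...
        myFunction.js    # handler code; requires modules from the shared node_modules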
I tried to follow the official guide.
My serverless.yml looks like this:
functions:
  myFunction:
    package:
      exclude:
        - 'functions/node_modules/**'
        - '!functions/node_modules/module1_I_want_to_include/**'
        - '!functions/node_modules/module2_I_want_to_include/**'
Now, with sls deploy, I get:
Serverless: Uploading service .zip file to S3 (31.02 MB)...
and the function works :)
However, with sls deploy function -f myFunction, I get:
Serverless: Uploading function: dispatch (1.65 MB)...
It does upload in a reasonable time, but the function now gives the following error:
Unable to import module 'functions/myFunction': Error
The sls deploy function command deploys an individual function without AWS CloudFormation. This command simply swaps out the zip file that your CloudFormation stack is pointing toward. This is a much faster way of deploying changes in code.
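Under the hood, a single-function deploy amounts to, roughly, an UpdateFunctionCode call against the Lambda function the stack already created. A minimal sketch with the AWS CLI (the function name and zip path below are placeholders, using the usual service-stage-function naming convention):

    # Push a new zip for one function without touching the CloudFormation stack
    $ aws lambda update-function-code \
        --function-name my-service-dev-myFunction \
        --zip-file fileb://.serverless/myFunction.zip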
Things I would look at:
1. Try comparing what happens between the two:

   $ SLS_DEBUG=true sls deploy --verbose

   and

   $ SLS_DEBUG=true sls deploy function -f myFunction --verbose

2. Check your serverless config (packaging, etc.) against your project structure. One red flag is that the function deploy is as big as the service deploy; this could be a misconfiguration problem (see the packaging sketch after this list).

3. Use serverless package to see how the package(s) are zipped. It can provide some clues (see the inspection example after this list).

4. Are you using any plugins which may have altered the way your package is created?

5. How many node_modules directories do you have? Only one for the entire service, or one for each function?
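Regarding point 2, a packaging setup along these lines is one way to keep per-function zips small when there is a single shared node_modules. This is only a sketch: the handler path is assumed, and exclude/include behaviour differs slightly across serverless 1.x releases.

    package:
      individually: true                       # one zip per function instead of one service zip
      exclude:
        - functions/node_modules/**            # drop the shared node_modules by default

    functions:
      myFunction:
        handler: functions/myFunction.handler  # assumed handler path
        package:
          include:
            - functions/node_modules/module1_I_want_to_include/**
            - functions/node_modules/module2_I_want_to_include/**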
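And for point 3, you can build the artifacts locally and look inside them without deploying anything (the output directory name below is arbitrary; the zip file names depend on whether you package individually):

    # Build the deployment artifacts locally, without deploying
    $ sls package --package .serverless-inspect

    # See what actually ended up in the zip(s)
    $ ls -lh .serverless-inspect/
    $ unzip -l .serverless-inspect/*.zip | less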
You can make the deploy process more verbose by passing the --verbose argument to the deploy command.
Either sls deploy --verbose or sls deploy -v will do the trick.