I'm working on a project using AWS Lambda with Node.js. We use Docker containers for our development environment.
Our current setup spins up AWS SAM Local on port 3000. It runs start-api and mounts the functions defined in my template.yml file. I test these functions by using Postman to send JSON to the mounted API endpoint, like so: http://127.0.0.1:3000/foo
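(For reference, an equivalent request from the command line, with a placeholder payload, looks something like this:)
$ curl -X POST http://127.0.0.1:3000/foo \
    -H "Content-Type: application/json" \
    -d '{"name": "test"}'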
The Docker setup also spins up a separate Node.js instance on :4000.
I'm able to test the Lambda functions locally as described above. However, I want to enable debugging so that I can step through a function and inspect variables instead of relying on console.log(). I can't figure out how to edit the Dockerfile / docker-compose.yml to make that happen.
Here is my docker-compose file:
version: '3'
services:
  web:
    build: ./web
    container_name: someapp
    command: npm run dev
    volumes:
      - ./web:/usr/app/
      - /usr/app/node_modules
    ports:
      - "4000:4000"
    environment:
      DATABASE_URL: mongo://someapp:[email protected]:37017,10.10.62.205:37018,10.10.62.205:37019/somedb
  sam:
    build: serverless/.
    container_name: samlocal
    command: sam local start-api --host 0.0.0.0
    environment:
      COMPOSE_CONVERT_WINDOWS_PATHS: 1
      SAM_DOCKER_VOLUME_BASEDIR: ${CURRENT_DIRECTORY}/serverless
      DATABASE_URL: mongo://someapp:[email protected]:37017,10.10.62.205:37018,10.10.62.205:37019/somedb
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./serverless:/var/opt
    ports:
      - "3000:3000"
Here is the Dockerfile for SAM, which is in a directory called "serverless":
FROM alpine:3.6

ENV VERSION=0.2.2

RUN apk add --no-cache curl && \
    curl -sSLO https://github.com/awslabs/aws-sam-local/releases/download/v${VERSION}/sam_${VERSION}_linux_386.tar.gz && \
    tar -C /usr/local/bin -zxvf /sam_${VERSION}_linux_386.tar.gz && \
    apk del curl && \
    rm -f /sam_${VERSION}_linux_386.tar.gz

# awscli for "sam package" and "sam deploy"
RUN apk add --no-cache py-pip && pip install awscli

WORKDIR /var/opt

EXPOSE 3000
I've tried various permutations of adding the -d debug flag to the command for the "sam" service in docker-compose.yml. For example: sam local start-api --host 0.0.0.0 -d 8080. I then changed the port mapping to expose the debug port, but I can't figure out how to get the mapping to work; as soon as I hit the endpoint I get port errors.
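For concreteness, the change I was attempting in docker-compose.yml looked roughly like this (the 8080 debug port is arbitrary, and this did not work for me):
  sam:
    build: serverless/.
    container_name: samlocal
    command: sam local start-api --host 0.0.0.0 -d 8080
    ...
    ports:
      - "3000:3000"
      - "8080:8080"    # attempt to expose the debug port to the host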
I'm still getting up to speed on Docker / docker-compose and a total newbie when it comes to the Lambda stuff, so sorry if the question is silly.
TIA!
I have determined that my initial approach was incorrect.
The benefit of making SAM Local part of my docker-compose setup (or so I thought) was that I was saving my teammates the trouble of installing it locally on their machines. And starting it in this way:
sam local start-api --host 0.0.0.0
seemed to confer the additional convenience of "automagically" mounting all of the API endpoints.
However, this does not afford the user the flexibility to toggle debugging on and off. Nor does it allow the user to switch between POST-ing input with Postman or loading the input data from a file.
So here is the way I'm doing it now:
$ npm install aws-sam-local -g
$ cd path/to/directory/containing/template.yml
To use Postman:
$ sam local start-api
or Postman plus debugger:
$ sam local start-api -d 5858
To bypass Postman and simply read the input from a file:
$ sam local invoke "NameOfResource" -e ./path/to/file.json
or to do the same with a debugger:
$ sam local invoke "NameOfResource" -e ./path/to/file.json -d 5858
NOTE: in the above examples, "NameOfResource" must be a string and must match the resource name listed in template.yml (which may differ from the actual function name in the source code).
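To illustrate, the resource name comes from an entry like this in template.yml (the names and runtime below are made up for the example):
Resources:
  NameOfResource:                        # the name passed to "sam local invoke"
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.actualFunctionName  # the exported function in the source file
      Runtime: nodejs8.10
      Events:
        FooApi:
          Type: Api
          Properties:
            Path: /foo
            Method: post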
Doing it this way I am able to connect a remote Node.js debugger in WebStorm and set breakpoints. I'm also able to connect the Visual Studio Code debugger. However, Visual Studio Code seems to ignore my breakpoints, forcing me to use debugger; statements. This is unfortunate since my teammates all use Visual Studio Code and WebStorm is not free. If anyone knows how to get around the Visual Studio Code issue, holler!
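For reference, the VS Code attach configuration I've been experimenting with (in .vscode/launch.json) looks roughly like this; the remoteRoot and protocol values are assumptions based on how SAM Local mounts the function code, so adjust them for your runtime:
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to SAM Local",
      "type": "node",
      "request": "attach",
      "address": "localhost",
      "port": 5858,
      "localRoot": "${workspaceRoot}",
      "remoteRoot": "/var/task",
      "protocol": "legacy"
    }
  ]
}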