I have a trivial Node.js Docker app.
I am able to run it successfully, but code reloading inside the container does not work, despite mounting the volume as described in the docker-compose docs.
Directory layout:
my-test-app
| docker-compose.yml
| Dockerfile
| index.js
| package.json
Dockerfile:
FROM mhart/alpine-node:8
WORKDIR /app
COPY . .
EXPOSE 5000
CMD ["node", "index.js"]
docker-compose.yml:
version: '3'
services:
  node-app:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/app
  redis:
    image: "redis:alpine"
index.js:
const http = require('http');
const server = http.createServer((req, res) => {
  res.end("hello world");
});
server.listen(5000);
package.json:
{
  "name": "my-test-app",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "MIT"
}
I have also tried the following, with the same result (the app runs, but live code reload does not work):
- alpine-node
- . ./app
- a folder inside my-test-app, updating the COPY inside the Dockerfile and the volumes inside docker-compose.yml accordingly.
I have seen the phrase "live reload" used to apply to two different types of reloading: restarting the server process when the application code changes, and automatically refreshing the front-end in the browser.
Based on your question, I think you're referring to the first type (restarting the server process), so the answer that follows addresses that.
The problem here is one of context.
Remember that the docker container is isolated from your host - specifically, the processes running in the container are distinct from (and generally cannot interact with) processes running on the host. In your case, you have chosen to mount a host directory in the container, but that's just the filesystem, not the processes.
Think through what your Docker image does when you instantiate a new container: it runs node index.js in the WORKDIR. Where is the code to stop it and restart it when the code changes? Presumably it's running in a process on the host. This means that it cannot touch the node process running in the container (because it's isolated).
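To see the distinction concretely, here is a minimal illustration (assuming the compose file above and the Docker Compose v2 docker compose CLI): the bind mount does propagate your edit into the container, but the already-running node process keeps serving the old code until it is restarted.

# edit index.js on the host, e.g. change "hello world" to "hello again"

# the change is visible inside the container -- the bind mount works
docker compose exec node-app cat index.js

# but the node process that is already running still serves the old code
curl http://localhost:5000   # -> hello world

# only restarting the process picks up the change
docker compose restart node-app
curl http://localhost:5000   # -> hello again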
Now, you haven't mentioned what method you're using to handle the live reloading, but that shouldn't make too much of a difference. They all basically work the same way: on a change to the application code, kill the existing process and start a new one.
To solve this, you have two options: run the reload tooling inside the container, so it can restart the node process there, or run the code outside Docker during development and only package it into an image at deployment.
For the first, you could follow @MarkS's suggestion and use nodemon. This should be as simple as replacing CMD ["node", "index.js"] in your Dockerfile with CMD ["nodemon", "index.js"], provided, of course, that you have nodemon properly installed in the image.
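A minimal sketch of that Dockerfile, assuming nodemon is installed globally in the image (the RUN npm install -g nodemon step is added here for illustration; it is not in the original Dockerfile):

FROM mhart/alpine-node:8
WORKDIR /app
# install nodemon so the container can watch files and restart node itself
RUN npm install -g nodemon
COPY . .
EXPOSE 5000
# nodemon restarts node whenever files under /app change,
# including changes arriving through the bind mount
CMD ["nodemon", "index.js"]

Because the watcher now lives in the same isolated environment as the node process, it sees the file changes coming in through the bind mount and can restart node inside the container.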
The alternative, and this is what I do, is to run the code on the host outside the Docker environment during development, and then package it up in an image at deployment. This solves two problems. First, live reloading just works, because the watcher and the node process both run on the host. Second, remember that apps running in Docker run as root by default. This means that if your app creates files, they're going to be owned by root. I tried developing in a Docker environment, but got frustrated by problems where, for example, I wanted to delete a file created by the app and had to sudo (sign in as root) just to clean up stuff.
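As a sketch of that workflow (the commands below are illustrative assumptions, not part of the original setup): run nodemon directly on the host during development, and only build the image when you deploy.

# development: run on the host, no container involved
npm install --save-dev nodemon
npx nodemon index.js        # restarts node whenever the code changes

# deployment: package the current code into an image and run it
docker compose build node-app
docker compose up -d node-app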
The only difference between what I have and what you have is that I don't have the COPY directive in my Dockerfile. Try removing that and see if it works.
I also use nodemon to automatically restart the node server when the code changes:
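A minimal sketch of such a setup, assuming nodemon is installed globally in the image and the code is supplied only through the .:/app bind mount (the exact Dockerfile here is an illustrative assumption):

FROM mhart/alpine-node:8
WORKDIR /app
# nodemon is installed globally so it can be used as the start command
RUN npm install -g nodemon
EXPOSE 5000
# no COPY here: during development the code comes from the .:/app bind mount
# declared in docker-compose.yml
CMD ["nodemon", "index.js"]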