 

Docker: Best practice for development and production environment

Suppose I have a simple Node.js app. I can build an image to run the app with a simple Dockerfile like this:

FROM ubuntu:16.04
RUN apt-get update && apt-get install -y nodejs nodejs-legacy npm
COPY . /app
WORKDIR /app
RUN npm install
CMD node index.js

This will copy the source code into the container and I can ship it off to a registry no problem.
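
For example, a typical build-and-push flow might look like this (the registry and image name below are only placeholders):

docker build -t registry.example.com/my-node-app:1.0 .
docker push registry.example.com/my-node-app:1.0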

But for development I don't want to rebuild the image for every change in my code. So naturally, I use a volume in combination with nodemon. Here are my questions:

  • How do I keep the different configurations? Two dockerfiles? Use compose with two different compose files?
  • The node_modules folder on my host is different from the one I need in the container (i.e. some packages are installed globally on the host). Can I exclude it from the volume? If so, I need to run npm install after mounting the volume. How do I do this?

So my question is really: how do I keep dev and deploy environments separate? Two Dockerfiles? Two Compose files? Are there any best practices?

asked Aug 10 '16 by Gabb0




2 Answers

The way I handle it is that I have two Dockerfiles (Dockerfile and Dockerfile.dev).

In the Dockerfile.dev I have:

FROM node:6

# Update the repository
RUN apt-get update

# useful tools if need to ssh in or used by other tools
RUN apt-get install -y curl net-tools jq

# app location
ENV ROOT /usr/src/app

COPY package.json /usr/src/app/

# copy over private npm repo access file
ADD .npmrc /usr/src/app/.npmrc

# set working directory
WORKDIR ${ROOT}

# install packages
RUN npm install

# copy all other files over
COPY . ${ROOT}

# start it up
CMD [ "npm", "run", "start" ]

# port the app listens on
EXPOSE 3000

My npm scripts look like this:

"scripts": {
    ....
    "start": "node_modules/.bin/supervisor -e js,json --watch './src/' --no-restart-on error ./index.js",
    "start-production": "node index.js",
    ....
},

You will notice it uses supervisor for start, so any change to any file under src will restart the server without requiring the Docker container itself to be restarted.
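
For this to work, supervisor has to be present in node_modules; assuming it is tracked as a dev dependency, it would be added with something like:

npm install --save-dev supervisor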

Last is the Docker Compose file.

dev:
  build: .
  dockerfile: Dockerfile.dev
  volumes:
    - "./src:/usr/src/app/src"
    - "./node_modules:/usr/src/node_modules"
  ports:
    - "3000:3000"

prod:
  build: .
  dockerfile: Dockerfile
  ports:
    - "3000:3000"

So you can see that in dev mode it mounts the current directory's src folder into the container at /usr/src/app/src, and also mounts the node_modules directory to /usr/src/node_modules.

This means I can make changes locally and save; the volume updates the files in the container, and supervisor sees the change and restarts the server.

** Note: since supervisor doesn't watch the node_modules folder, you have to change a file in the src directory to trigger a restart. **
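
With this setup, and assuming the Compose file above is saved as docker-compose.yml, day-to-day usage would be roughly:

# start the dev service with live reload via the mounted src volume
docker-compose up dev

# start the production-style service built from the plain Dockerfile
docker-compose up -d prod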

answered Nov 15 '22 by Shawn C.


Use environment variables. See the Docker documentation on environment variables. This is the recommended way, also for use in production.
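
A minimal sketch of what that can look like, assuming the app switches behaviour on a NODE_ENV variable (the service name and value here are only illustrative), using the same Compose v1 syntax as the answer above:

app:
  build: .
  environment:
    - NODE_ENV=production
  ports:
    - "3000:3000"

Without Compose, the same variable can be passed with docker run -e NODE_ENV=production <image>.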

answered Nov 15 '22 by Piotr Dobrysiak