 

Docker with yarn

First time trying to get yarn and docker working together. How can I stop yarn from installing the packages every single time I run docker build command?

I've found some solutions, like storing node_modules in a temporary directory and then linking it, but with the various packages installed I run into too many errors to handle. Is there maybe a way to compare my yarn.lock with the one already inside the image, or some other solution?

Dockerfile:

FROM node:8.9.1-alpine

COPY package.json yarn.lock /usr/src/
RUN cd /usr/src \
    && yarn install --pure-lockfile

COPY . /usr/src
EXPOSE 3005

With this setup I get the message "Sending build context to Docker daemon 375.2MB", and then yarn install runs as usual, fetching the packages every single time.

asked Nov 10 '17 by mdmb


2 Answers

Definitely pay attention to Docker layer caching. Basically, you want to run the most stable instructions earlier than the less stable ones. An instruction that would produce the same result as in an earlier build is served from the cache rather than rerun (excluding issues with ENV/ARG instructions). But once one instruction does need to be run, every instruction after it will be run regardless of what's in the cache.

.dockerignore will also help, but it can be easy for things to slip in. I've adopted the practice of inverting the file by ignoring everything and then specifying exactly what should be copied.

To minimise fetching from the web, I like using the yarn offline mirror. It stores the tarballs of installed dependencies and reuses them for future installs, so whenever the Docker cache is invalidated you still get a fresh install (the benefit over something like npm rebuild) without re-downloading everything from the registry. You configure the offline mirror with a .yarnrc file, which could live in your home directory, but for this purpose you keep one in your repo along with a directory to store the tarballs.

You have the option of checking the mirror directory into your repo; even a fairly large set of npm dependencies will usually only come to around 20 MB of tarballs.

If you use the files below, customised for your needs, and run yarn locally, the yarn-offline-mirror will contain the tarballs needed to install the app.

.dockerignore

*

!yarn-offline-mirror/
!src/
!package.json
!yarn.lock
!.yarnrc

.yarnrc

# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1

yarn-offline-mirror "./yarn-offline-mirror"
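
With this .yarnrc in place, running yarn locally fills the mirror directory with tarballs. A rough sketch of populating it from scratch (assuming yarn v1; clearing the cache just forces the tarballs to be re-fetched and written into the mirror):

# populate ./yarn-offline-mirror locally (sketch, yarn v1 assumed)
yarn cache clean
rm -rf node_modules
yarn install
# optionally commit the tarballs for reproducible offline builds
git add yarn-offline-mirror yarn.lock .yarnrc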

Dockerfile

# base image (same as in the question's Dockerfile)
FROM node:8.9.1-alpine

ENV HOME /usr/src/
WORKDIR $HOME

# copy the tarballs
COPY ["yarn-offline-mirror", "$HOME/yarn-offline-mirror/"]

# copy files needed for the install
COPY ["package.json", "yarn.lock", ".yarnrc", "$HOME/"]

# the offline flag will mean that an error is raised if any
# module needs to be fetched remotely. It can be removed to allow
# yarn to fetch any missing modules if that was to happen.
RUN yarn --offline --frozen-lockfile --link-duplicates

# copy the rest.. could be further broken up into multiple instructions
# for cache optimisation
COPY . $HOME

CMD npm start
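
Building and running then looks something like this (the image name here is just a placeholder, and the port matches the EXPOSE from the question); on subsequent builds only the layers after whatever changed are re-run:

docker build -t my-app .
docker run -p 3005:3005 my-app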
answered Sep 19 '22 by lecstor

You should make better use of the Docker build cache.

If you have your Dockerfile prepared as follows:

FROM node:carbon

COPY package.json yarn.lock /app/
RUN cd /app \
    && yarn install --pure-lockfile

COPY . /app
CMD doStuff

docker build will reuse the cached layer for package.json and yarn.lock unless those files have changed, so the RUN yarn install step that follows is only re-executed when that earlier layer's cache has been invalidated.

NOTE: keep node_modules in your .dockerignore file
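
For example, a minimal .dockerignore along these lines (entries beyond node_modules are just common suggestions, adjust for your project):

node_modules
.git
npm-debug.log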

answered Sep 21 '22 by Stefano