
How can I improve the build time of an Angular 5 project using Docker?

I’m trying to improve the build time of my automation. Right now it takes 14 mins just to build the front-end.



This is what I have so far:

web.dockerfile

### STAGE 1: Build ###
FROM node:9.3.0-alpine as builder

COPY package.json ./

RUN npm set progress=false && npm config set depth 0 && npm cache clean --force

## Storing node modules on a separate layer will prevent unnecessary npm installs at each build
RUN npm i
RUN mkdir /web
RUN cp -R ./node_modules ./web

WORKDIR /web

COPY . .

RUN $(npm bin)/ng build --prod --build-optimizer

### STAGE 2: Setup ###

FROM nginx:1.13.8-alpine

COPY nginx.conf /etc/nginx/nginx.conf
COPY site.conf /etc/nginx/conf.d/default.conf
RUN rm -rf /usr/share/nginx/html/*

COPY --from=builder /web/dist /usr/share/nginx/html/

RUN touch /var/run/nginx.pid && \
  chown -R nginx:nginx /var/run/nginx.pid && \
  chown -R nginx:nginx /var/cache/nginx && \
  chown -R nginx:nginx /usr/share/nginx/html

USER nginx

RUN $(npm bin)/ng build --prod --build-optimizer

The line above (from STAGE 1) is responsible for almost the entire build time, roughly 99% of it.


.angular-cli.json

{
  "$schema": "./node_modules/@angular/cli/lib/config/schema.json",
  "project": {
    "name": "web"
  },
  "apps": [{
    "root": "src",
    "outDir": "dist",
    "assets": [
      "assets",
      "favicon.ico"
    ],
    "index": "index.html",
    "main": "main.ts",
    "polyfills": "polyfills.ts",
    "test": "test.ts",
    "tsconfig": "tsconfig.app.json",
    "testTsconfig": "tsconfig.spec.json",
    "prefix": "app",
    "styles": [
      "styles.css",
      "../node_modules/bootstrap/dist/css/bootstrap.min.css",
      "../node_modules/ngx-toastr/toastr.css",
      "../src/assets/css/style.css",
      "../src/assets/css/colors/blue.css"

    ],
    "scripts": [
      "../node_modules/jquery/dist/jquery.min.js",
      "../node_modules/popper.js/dist/umd/popper.min.js",
      "../node_modules/bootstrap/dist/js/bootstrap.min.js",
      "../node_modules/jquery-slimscroll/jquery.slimscroll.min.js",
      "../node_modules/pace-js/pace.min.js"
    ],
    "environmentSource": "environments/environment.ts",
    "environments": {
      "dev": "environments/environment.ts",
      "prod": "environments/environment.prod.ts"
    }
  }],
  "e2e": {
    "protractor": {
      "config": "./protractor.conf.js"
    }
  },
  "lint": [{
      "project": "src/tsconfig.app.json",
      "exclude": "**/node_modules/**"
    },
    {
      "project": "src/tsconfig.spec.json",
      "exclude": "**/node_modules/**"
    },
    {
      "project": "e2e/tsconfig.e2e.json",
      "exclude": "**/node_modules/**"
    }
  ],
  "test": {
    "karma": {
      "config": "./karma.conf.js"
    }
  },
  "defaults": {
    "styleExt": "css",
    "component": {}
  }
}

Environment

Docker Cloud connected to my AWS account

AWS: EC2 micro instance


Result

This Dockerfile works and the build succeeds.

But it takes about 14 minutes to build. Is it possible to improve this? Is it because my instance has too little processing power?

asked Jan 17 '18 by code-8


1 Answer

[TL;DR]

  • Use volumes to store node_modules and .npm
  • Parallelize parts of your process (e.g. tests)
  • Be careful when using relative paths
  • Do not copy your entire project with COPY . .; it can cause relative-path issues and leak sensitive files.
  • Create a separate image containing only core dependencies for building and testing (e.g. npm, java, chrome-driver, libgconf2).
  • Configure pipelines to use this image
  • Let the CI clone the repo and copy your project into the container for building and testing
  • Archive built files (e.g. dist) and tag based on failure rates
  • Create a new image with just enough things to run your built files.

[LONG VERSION]

There is a good chance that your npm dependencies are being re-downloaded and/or your docker images are being rebuilt for every build you run.

Rather than copying files into a Docker image, it would be better to mount volumes for modules and cache so that additional dependencies included later don't need to be downloaded again. Typical directories you should consider creating volumes for are node_modules (one for global and one for local) and .npm (the cache).
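
If the build runs inside a container started by the CI (rather than inside docker build itself), named Docker volumes can hold the npm cache and node_modules between runs. A minimal sketch, assuming the project is mounted at /web; the volume names and paths are placeholders, not part of the question's setup:

# Named volumes persist between runs; names are placeholders
docker volume create npm-cache
docker volume create web-node-modules

# Mount the npm cache and node_modules so repeat builds reuse them
docker run --rm \
  -v npm-cache:/root/.npm \
  -v web-node-modules:/web/node_modules \
  -v "$PWD":/web \
  -w /web \
  node:9.3.0-alpine \
  sh -c 'npm i && $(npm bin)/ng build --prod --build-optimizer'

The first run still downloads everything; later runs only fetch whatever changed in package.json.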

Your package.json is being copied into root / and the same package.json is being copied into /web with COPY . ..

The initial run of npm i installs into / and the process is effectively repeated for /web. You're downloading dependencies twice, but are the modules in / ever going to be used for anything? Regardless, you appear to be using the same package.json for both npm i and ng build, so the same work is being done twice ([EDIT]: it would seem that ng build doesn't re-download packages), but because node_modules isn't available in /, the npm i command creates another one and re-downloads all packages.

You create a web directory in the root /, but other commands use relative paths such as ./web. Are you certain that everything is running in the right place? There is no guarantee that programs look in the directories you expect when you use relative paths. While it may appear to work for this image, the same practice will not be consistent across other images that may have different initial working directories.
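
To address both the duplicated install work and the relative-path ambiguity, the first stage could set WORKDIR up front, copy only package.json, install, and then copy the sources, so the npm i layer stays cached until package.json changes. A rough sketch (same base image as in the question; this is an illustration, not the asker's exact setup):

### STAGE 1: Build ###
FROM node:9.3.0-alpine as builder

# Everything below runs relative to /web, so no mkdir/cp juggling is needed
WORKDIR /web

# Copy only the dependency manifest first; this layer and the npm i layer
# are reused from cache until package.json changes
COPY package.json ./
RUN npm set progress=false && npm config set depth 0 && npm i

# Copy the sources afterwards; source edits no longer invalidate npm i
COPY . .

RUN $(npm bin)/ng build --prod --build-optimizer

Note that layer caching only helps when the Docker cache is present on the machine doing the build; on a fresh CI agent every layer is rebuilt, which is where the volume and prebuilt-image suggestions come in.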

[may or may not be relevant information]

Although I'm not using Bitbucket for automating builds, I faced a similar issue when running Jenkins pipelines. Jenkins placed the project in a different directory, so every time it ran, all the dependencies were downloaded again. I initially thought the project would be in /home/agent/project, but it was actually placed elsewhere. I found the directory the project was copied to by running the pwd and npm cache verify commands in a build step, then mounted the volumes to the correct places. You can view the output in the logs generated on builds.
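
A throwaway diagnostic step along these lines can confirm where the checkout and the npm cache actually live (the echo labels are only for readability; remove the step once the volumes are mounted correctly):

# Temporary diagnostic step: print where the build is running
# and where npm keeps its cache
echo "working directory:" && pwd
echo "npm cache location:" && npm config get cache
npm cache verify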

You can view the output by expanding the section within the pipelines page.


If the image is being rebuilt on every run, build your image separately, push it to a registry, and configure the pipeline file to use your image instead. Try to use already-available base images whenever possible, unless you need dependencies that the base image lacks (things like Alpine's apk packages, not npm packages; npm dependencies can be stored in volumes). If you're going to use a public registry, do not store any files that may contain sensitive data. Configure your pipeline so that things are mounted with volumes and/or supplied as secrets.
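
For example, a base image containing only the core build/test tooling could be built once and pushed, then referenced from the pipeline configuration instead of being rebuilt on every run. The file name, registry, and tag below are placeholders:

# Build a base image with just the CI tooling (node, browser for tests, etc.)
docker build -f ci.dockerfile -t myregistry/web-ci:1.0 .

# Push it once; the pipeline then pulls this image instead of rebuilding it
docker push myregistry/web-ci:1.0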

A basic restructure of the test and build steps:

       Image on Docker Hub
              |
              |
           ---|-------------------------------------|
           |                       |                |
           V                       V                |
Commit -- build (no test) ---> e2e tests (no build)-]--+--> archive build --> (deploy/merge/etc)
                         |           _______________|  ^
                         |           v                 |
                         |-> unit tests (no build)---->|

You don't need to follow it entirely, but it should give you an idea of how you could use parallel steps to separate tasks and improve completion times.
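
If your CI tool doesn't offer parallel steps directly, even a plain shell step can run the two test suites concurrently against a single build. A rough sketch, assuming package.json defines test and e2e scripts configured for single-run CI use:

# Start both suites in the background, then wait on each;
# the step fails if either suite fails
npm run test &
UNIT_PID=$!
npm run e2e &
E2E_PID=$!

FAIL=0
wait "$UNIT_PID" || FAIL=1
wait "$E2E_PID" || FAIL=1
exit "$FAIL"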

answered Oct 22 '22 by ToninGuy3n