Is there a way to improve the workflow when working with large npm dependencies in GitLab CI? I have a project where puppeteer is an npm dependency, and since puppeteer essentially ships Chromium, it is pretty large (about two thirds of the entire node_modules directory).
Is there a way to pre-install puppeteer into the pipeline, so that the entire dependency does not have to be uploaded to and downloaded from the cache server on every run? It would also be great if it did not have to be installed with npm install every time.
Check whether caching Node.js dependencies can help:
If your project uses npm to install Node.js dependencies, the following example defines the cache globally so that all jobs inherit it.
By default, npm stores cache data in the home folder (~/.npm). However, you can't cache things outside of the project directory.
Instead, tell npm to use ./.npm, and cache it per branch:
#
# https://gitlab.com/gitlab-org/gitlab/-/tree/master/lib/gitlab/ci/templates/Nodejs.gitlab-ci.yml
#
image: node:latest
# Cache modules in between jobs
cache:
  key: $CI_COMMIT_REF_SLUG
  paths:
    - .npm/
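# Install from the project-local .npm cache directory;
# --prefer-offline reuses already-cached packages instead of re-downloading them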
before_script:
  - npm ci --cache .npm --prefer-offline
test_async:
  script:
    - node ./specs/start.js ./specs/async.spec.js
If you need that same cache for each pipeline execution (not just shared across jobs inside one pipeline), you might need to save that .npm directory as an artifact that you can restore at the next pipeline execution as a local .npm/ cache.
See "GitLab CI: Cache and Artifacts explained by example" from Anton Yakutovich