 

Automatically Build NPM Module On Install From Github

A project's lib/ directory shouldn't be checked into Git because the files it contains are derived by the build process. When the package is installed from the project's GitHub repository (during development, for example), the lib/ directory will not exist. So if the package.json main field points to (for example) lib/index.js, the package cannot be imported, because those files exist neither in the repository nor in the copy installed into node_modules. The package therefore needs to be built, just as it would be before a release, only this time locally, so that the lib/ directory (or whatever other files the build process generates) is added to the module's directory.

Assuming there is a build script within the package.json file's scripts field, can the package be configured to run this automatically only when it is installed from GitHub? If not, what is the best approach to ensuring it is built when installed from GitHub?

There are now prepublish, prepublishOnly and prepare lifecycle hooks, but none of them answers this problem, because they provide no way to differentiate between install sources. In short, yes, they allow you to build on install, but they don't allow you to build on install from GitHub only. Not only is there no reason to force a build when people install from npm, but, more importantly, development dependencies (for example babel, which is critical to the build) will not be installed.

I am aware of one strategy to deal with this problem:

  • Fork / branch the repo.
  • Build locally.
  • Remove the lib/ dir from .gitignore and check it in.
  • Install the module from your fork/branch.
  • When you are ready for a PR / rebase, add the lib/ dir back to .gitignore and remove the dir from Git.

But that is far from ideal. I guess this could be automated with a githook, though, so that every time you push to master the project also builds and pushes to a separate branch.

There is a closed issue on NPM's Github with no resolution - just lots of people who want a solution. From this issue it is clear that using prepare is not the answer.

My usecase is that I am developing a module that is used in a number of other projects. I want to consume the latest version of the module without having to push out a release to NPM whenever I update the codebase - I would rather push out fewer releases when I am ready, but I still need to consume whatever the latest version of the lib is on Github.

Note: I also reached out to NPM's support regarding this issue and I'll add their response if I receive one.

Asked by Undistraction on Jan 16 '18




2 Answers

Edit: Detecting whether the package is being installed from a git repo

I didn't understand the question properly at first - the sections further down are a bit off-topic, but I've left them in. For now, if you want to run build.js only when installing from the repo:

Files in repo:

 .gitignore  .npmignore  ThisIsNpmPackage  build.js  package.json 

The .gitignore:

ThisIsNpmPackage 

The .npmignore:

!ThisIsNpmPackage 

In the package.json:

"scripts": {
    "install": "( [ ! -f ./ThisIsNpmPackage ] && [ ! -f ./AlreadyInstalled ] && echo \"\" > ./AlreadyInstalled && npm install . && node ./build.js ) || echo \"SKIP: NON GIT SOURCE\""
}

The idea is to make the file ThisIsNpmPackage present in the published npm package but absent from the repo: it is gitignored, and the ! rule in .npmignore re-includes it when the package is packed.

The install hook is just a bit of bash script that checks whether ThisIsNpmPackage exists. If it does not (meaning the install came from git), we execute npm install . (this ensures we have devDependencies) and run the build. The file AlreadyInstalled is generated to prevent infinite looping (npm install would otherwise recursively invoke the install hook).

When publishing, I do git push and npm publish.
Note that npm publish can be automated via CI tools or githooks.

This little hack with the ThisIsNpmPackage file makes source detection possible.

Results of invoking npm install dumb-package:

"SKIP: NON-GIT SOURCE"

And executing npm install https://github.com/styczynski/dumb-package

Files will be built
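Unpacked into a standalone script, the one-liner's logic is roughly this (a sketch; the filenames follow the example above):

```shell
#!/bin/sh
# Runs as the package's "install" lifecycle script.
# ThisIsNpmPackage ships in the npm tarball but is absent from the git repo,
# so a missing marker means we were installed from git and must build.
if [ ! -f ./ThisIsNpmPackage ] && [ ! -f ./AlreadyInstalled ]; then
  : > ./AlreadyInstalled   # guard: the npm install below re-triggers this hook
  npm install .            # pulls in devDependencies (babel etc.)
  node ./build.js          # run the actual build
else
  echo "SKIP: NON GIT SOURCE"
fi
```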

The issues

The main issues we are facing here are the following ones:

  • Have to do npm publish ... everytime

    Sometimes it's too much pain: you fix a small bug, push to the repo, and forget to publish on npm. When I was working on a microservices-based project with about 5 standalone subprojects divided into modules, finding an issue, fixing it, and then forgetting to publish it everywhere I had to was really annoying.

  • Don't want to push lib into the repo, because it's derived from sources

  • Rebasing and merging is even more annoying.

  • No mess with .gitignore

    Heck, I know that problem: troublesome files that you have to include in the repo but never modify, or sometimes remove? That's just sick.

Edit: npm hooks

As @Roy Tinker mentioned, there exists the ability for a package to execute a command when it is installed.
This can be achieved via npm hooks.

"install": "npm run build" 

And we execute:

npm install https://github.com/<user>/<package>

Edit:
OP question from comments:

But this will run an install for everyone downloading the module from npm right? This is hugely problematic given that dev dependencies will not be installed for anyone downloading the module from npm. The libs used to build the app - babel etc will not be installed.

Note: But if you want a specific version of the package (production/dev) with or without dev dependencies you can install it via:

npm install --only=dev

The --only={prod[uction]|dev[elopment]} argument will cause either only devDependencies or only non-devDependencies to be installed regardless of the NODE_ENV.

A better solution, in my opinion, is to use:

npm install <git remote url> 

And then inside package.json specify:

"scripts": {
    "prepare": "npm run build"
}

If the package being installed contains a prepare script, its dependencies and devDependencies will be installed, and the prepare script will be run, before the package is packaged and installed.

Example:

npm install git+https://[email protected]/npm/npm.git 

Read the npm docs there: npm install
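A minimal package.json wired up this way might look like the following (a sketch; the babel build command, file paths and version numbers are placeholders, not taken from the question):

```json
{
  "name": "my-package",
  "version": "1.0.0",
  "main": "lib/index.js",
  "files": ["lib/"],
  "scripts": {
    "build": "babel src -d lib",
    "prepare": "npm run build"
  },
  "devDependencies": {
    "babel-cli": "^6.26.0"
  }
}
```

With this, npm install <git remote url> installs the devDependencies and runs prepare, so the consumer gets lib/ built locally, while a registry install ships the prebuilt lib/ from the published tarball.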

Edit: proxy module (advanced technique)

It's kind of bad practice, but good to know.

Sometimes (as in the case of the Electron framework) you need to install other external packages or resources/modules depending on various conditions.

In these cases the proxy idea is used:

  • You make a module that behaves like an installer and installs all the dependent things you want

In your case the prepare script will be enough, but I leave this option here because it may sometimes be helpful.

The idea is that you write a module and write an install hook for it:

"scripts": {
    "install": "<do the install>"
}

In this scenario you can place there:

npm install . && npm run build 

This installs all devDependencies anyway (as the aforementioned prepare case does), but it's a bit of a hack.

If you want to do some real hacking there:

"scripts": {
    "install": "curl -L -J -O \"<some_url>\""
}

which manually downloads files using the *nix command curl.

It should be avoided, but it's an option in the case of a module that has huge binary files for each platform and you don't want to install them all.

Like in the case of Electron, where you have compiled binaries (one for each platform).

So you want people to run npm install package, not npm install package-linux or npm install package-windows etc.

So you provide a custom install script in the package.json:

{
  ...
  "scripts": {
    "install": "node ./install_platform_dep.js"
  }
}

Then, when installing the module, the install_platform_dep.js script will be executed. Inside install_platform_dep.js you place:

// For Windows...
if (process.platform === 'win32') {
    // Trigger Windows module installation
    exec('npm install fancy-module-windows', (err, stdout, stderr) => {
        // Some error handling...
    });
} else if ... // Process other OSes

And this installs everything in a purely manual way.

Note: Once again, this approach is useful for platform-dependent modules; if you reach for it otherwise, it's probably a design issue in your code.

Build on CI

What comes to my mind is a solution that I have used for a really long time: automatic building with CI services.

Most of the CI services' main purpose is to test/build/publish your code when pushing to the branch or doing other actions with the repo.

The idea is that you provide a settings file (like .travis.yml or .gitlab-ci.yml) and the tools take care of the rest.

If you really don't want to include the lib into the project, just trust CI to do everything:

  • A githook will trigger a build on commit (on one branch or any other - it's just a matter of configuration)
  • CI will build your files, then pass them to the test phase and publish them

Now I'm working on GitLab on my own project, doing (as a hobby) some webpage. The GitLab configuration that builds the project looks like this:

image: tetraweb/php

cache:
  untracked: true
  paths:
    - public_html/
    - node_modules/

before_script:
  - apt-get update

stages:
  - build
  - test
  - deploy

test_product:
  stage: test
  script:
    - npm run test

build_product:
  stage: build
  script:
    - npm run build

deploy_product:
  stage: deploy
  script:
    - npm run deploy

When I merge into the main branch the following events happen:

  • CI runs the build stage
  • If the build succeeds, the test stage is launched
  • If the test phase is OK, the deploy stage is finally triggered

The script is the list of unix commands to be executed.

You can specify any Docker image in the config, so you can in fact use any Unix flavour you want, with (or without) preinstalled tools.

There is a package deploy-to-git which deploys artefacts to the desired repo branch.

Or here (for Travis CI) is a piece of config that publishes artefacts to the repo:

travis-publish-to-git

(I have used it myself)

Then, of course, you can let CI run:

npm publish . 

Because CI executes Unix commands, it can (at least with a good number of the CI providers out there):

  • Publish tags (release tag maybe?)
  • Trigger a script to update the version of the project in all READMEs and everywhere else
  • Send you a notification if all phases succeeded

So what I do:
I commit, push, and let the tools do everything else I want.
In the meantime I make other changes, and after one to ten minutes I get an update report by mail.

There are plenty of CI providers out there:

  • Travis CI
  • Circle CI
  • Gitlab CI for Gitlab Projects

Here I attach another example of my other project (.travis.yml):

language: generic
install:
    - npm install
script:
    - chmod u+x ./utest.sh
    - chmod u+x ./self_test/autodetection_cli/someprogram.sh
    - cd self_test && bash ../utest.sh --ttools stime --tno-spinner

If you set up CI to push and publish your package, you can always be sure you're running the latest cutting-edge version of your code, without the "eh, now I have to run this command too..." problem.

I recommend you choose one of the CI providers out there.
The best ones offer you hundreds of capabilities!

When you get used to the publish, test and build phases happening automatically, you will see how it helps you enjoy life!
Then, to start another project with automatic scripts, just copy the configs!

Summary

In my opinion, the npm prepare script is the option to go with.
You may also want to try the others.

Each of the described methods has its drawbacks and can be used depending on what you want to achieve.
I just wanted to provide some alternatives - I hope one of them fits your problem!

Answered by Piotr Styczyński on Oct 18 '22


prepare is the correct way, but might seem broken

If you have a repository with source files but a "build" step is necessary to use it,
prepare does exactly what you want in all cases (as of npm 4).

prepare: Run both BEFORE the package is packed and published, on local npm install without any arguments, and when installing git dependencies.

You can even put your build dependencies into devDependencies and they will be installed before prepare is executed.

Here is an example of a package of mine that uses this method.


Problems with .gitignore - prepare will seem broken

There is one issue with this option that trips up many people. When preparing a dependency, npm and Yarn will keep only the files that are listed in the files section of package.json.

One might see that files defaults to all files being included and think they're done. What is easily missed is that:

  • .npmignore mostly overrides the files directive and,
  • if .npmignore does not exist, .gitignore is used instead.

So, if you have your built files listed in .gitignore, like a sane person, and don't do anything else, prepare will seem broken.

If you fix files to only include the built files or add an empty .npmignore, you're all set.
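For example, a files whitelist that publishes only the built output might look like this (a sketch; lib/ stands in for whatever your build emits):

```json
{
  "main": "lib/index.js",
  "files": [
    "lib/"
  ]
}
```

Note that package.json, the README, and the file named in main are always included in the tarball regardless of the files setting.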

My recommendation is to set files (or, by inversion, .npmignore) such that the only files actually published are those needed by users of the published package. Imho, there is no need to include uncompiled sources in published packages.

Answered by Cameron Tacklind on Oct 18 '22