I would like to create a clean architecture for my JavaScript project. The project consists of one Node.js server and two separate Angular.js front-ends with different purposes. For building the front-ends I use a custom Grunt build for each. Each build results in a single HTML file per project plus one minified/uglified CSS file and one JavaScript file. Each front-end then runs on a separate minimal Node server (serving only the static files).
So far, so clear. The goal now is to make it possible to add plugin modules to each of the three core projects. A module should extend the JavaScript of any one of the projects. In the case of a front-end, this means e.g. adding an additional Angular module to the Angular configuration. I already know where and how to add the Angular module code to the core app.
The problem is now: how do you create a reasonable build process across multiple projects that also depends on plugin modules? I came up with two solutions.
How would you solve that problem?
IMHO, this isn't really about the task runner (Gulp, Grunt, or even webpack — it doesn't really matter here). The community is moving toward owning lots of (relatively) tiny Node modules that do one thing and do it well, so this is closely connected to your first suggestion.
A VCS repo could be structured like this:
my-repo/
    package-1/
        package.json
    package-2/
        package.json
    package-3/
        package.json
    ...
... then you work with npm link to create symlinks between the modules themselves; that way you don't have to publish a module just to pick up its updates.
There's a pretty new package called lerna that does exactly this npm link thing automatically (it detects the dependency graph among your "local" modules and links them together); all you need to do is follow its structure. In addition, it publishes smartly, lets you run commands across all packages, and does a bunch of other related things.
Babel and React are great examples of this separation of concerns into modules. Babel works with lerna to automate the linking between packages; here are their reasons for having a monorepo.
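For the layout above, a minimal lerna.json could look like this (a sketch; the version values are placeholders and the exact fields depend on your lerna version):

```json
{
  "lerna": "2.0.0",
  "version": "independent",
  "packages": ["package-*"]
}
```

lerna bootstrap then links the local packages into each other's node_modules, and lerna publish only republishes the packages that actually changed.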
I usually do exactly what you have described under 1., but I use local dependencies that are concatenated by my build chain.
npm install ~/project_root/sub_project --save-dev
This way you do not need to push it all the time. You simply delete the node_modules/sub_project folder and run npm install again inside your build chain.
With browserify you can then add your dependencies to your Angular project.
A plugin is not runnable on its own.
I usually develop my services and directives as regular JavaScript classes and wrap them with Angular.
var Service = require("./path/to/service.js");

module.exports = [function () {
    return {
        restrict: 'AC',
        link: function ($scope, $element) {
            var someService = new Service();
            $scope.$watch("whatever", function (value) {
                // fixed: the original referenced an undefined `service` variable
                someService.doSomething(value);
            });
        }
    };
}];
This way you can run and test them independently of Angular. It is also an advantage when working with WebWorkers, since they cannot really (fully) be used with Angular 1.x.
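For example, the wrapped service itself can stay a plain constructor function (a sketch; the class body and the doSomething signature are assumptions, only the method name comes from the snippet above):

```javascript
// service.js — no Angular dependency, so it can be unit-tested in plain
// Node and reused inside a WebWorker.
function Service() {
    this.calls = 0;
}

Service.prototype.doSomething = function (value) {
    this.calls += 1;
    return "processed: " + value;
};

module.exports = Service;

// Plain-Node usage, no Angular bootstrap required:
var service = new Service();
service.doSomething("whatever"); // returns "processed: whatever"
```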
I had a similar problem a few months ago. Installing dynamic dependencies can be solved fairly easily if you are familiar with streaming build systems like Grunt or Gulp.
You can describe the plugin modules in a package.json (like dependencies or devDependencies), which will hold all the additional info you need; a bower.json or even a custom JSON file will do the trick too. I prefer npm package manifests or Bower manifests, as they provide a valid entry point for each module: the main property from each additional package.json will give you a file list with which to initialize an uglify or concatenation task.