
Why move node_modules to parent directory

In the Docker extension for Visual Studio Code, in the Node.js Dockerfile template, the node_modules directory is moved to the parent directory, in this line:

RUN npm install --production --silent && mv node_modules ../

The mv node_modules ../ looks useless to me. Is there a reason to do that?

asked Jan 15 '19 by nuno

People also ask

Should I push node_modules to Git?

For reference, the npm FAQ answers your question clearly: Check node_modules into git for things you deploy, such as websites and apps. Do not check node_modules into git for libraries and modules intended to be reused. Use npm to manage dependencies in your dev environment, but not in your deployment scripts.

What is the purpose of node_modules?

You can think of the node_modules folder as a cache for the external modules your project depends upon. When you run npm install, they are downloaded from the web and copied into the node_modules folder, and Node.js knows to look for them there when you import them (without a specific path).
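To make the "look for them there" part concrete, here is a minimal shell sketch (the /usr/src/app/src path is a hypothetical example) of the node_modules directories Node's resolver probes, starting from the importing file's directory and walking up to the filesystem root:

```shell
# Sketch: the node_modules candidates Node's resolver checks for a file
# in /usr/src/app/src (hypothetical path), walking up toward the root.
dir=/usr/src/app/src
while [ "$dir" != "/" ]; do
  echo "$dir/node_modules"
  dir=$(dirname "$dir")
done
echo "/node_modules"
```

This prints each parent's node_modules path in the order Node would try them, which is exactly why a node_modules folder one directory up still gets found.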

Do I need to push node_modules in production?

Not committing node_modules means you must list all your modules in package.json (and package-lock.json) as a mandatory step. This is great, because otherwise you might not have the diligence to do so, and some npm operations might break if you don't.

Can I move node_modules?

You can if you'd like, but it is not considered best practice. In any case, you shouldn't make any modifications to the files inside node_modules. Preferably, you should only copy over your package.json file and, optionally, your package-lock.json.


1 Answer

UPDATE: found a good explanation of the "why". See the section on Dealing With Node Modules.

TL;DR: it lets you run the app both inside and outside the container without worrying that OS-specific items in node_modules (e.g. natively compiled addons) will break when moving between the host and the container.
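As a sketch of how that plays out in development (the service name and paths here are hypothetical, not from the template): with node_modules one level up, you can bind-mount your host checkout over the app directory without the host's node_modules shadowing the one built inside the image:

```
# docker-compose.yml fragment (hypothetical names). The bind mount
# replaces /usr/src/app with the host checkout, but the image's
# node_modules lives in /usr/src, so it is left untouched.
services:
  web:
    build: .
    command: node index.js
    volumes:
      - .:/usr/src/app
```

If node_modules lived inside /usr/src/app, the mount would hide the container's copy and Node would pick up whatever the host happened to have built.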

Original (wrong as pointed out in the comments) answer:

This is to take advantage of the caching mechanism in Docker. Typically you'll have the following in your Dockerfile:

RUN apt-get # bunch of stuff you want installed
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*", "./"]
RUN npm install --silent && mv node_modules ../
COPY . .
CMD node index.js

Docker builds its images in "layers"; you can think of a new layer being added as each line in the Dockerfile is executed.

What we are saying in the snippet above is:

  1. Run apt-get and commit the layer to the cache
  2. Next, copy the package.json file from the host to the container, commit the layer
  3. Run npm install, then move the node_modules folder up one directory. Node's module resolution searches upward through parent directories until it finds a node_modules folder, so your app won't care. Commit this layer.
  4. Copy your source code contained in the current folder, to the container, commit this layer
  5. Finally, when your container is run, use the command node index.js
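The effect of step 3 can be verified with plain shell (the /tmp/demo prefix is illustrative): recreate the layout the Dockerfile produces, then walk upward from the app directory the way Node's resolver does:

```shell
# Recreate the image layout: node_modules one level above the app dir,
# as produced by "RUN npm install --silent && mv node_modules ../".
mkdir -p /tmp/demo/usr/src/node_modules /tmp/demo/usr/src/app

# Walk up from the app dir until a node_modules directory is found,
# mimicking Node's resolution order.
dir=/tmp/demo/usr/src/app
found=""
while [ -z "$found" ] && [ "$dir" != "/" ]; do
  if [ -d "$dir/node_modules" ]; then
    found="$dir/node_modules"
  fi
  dir=$(dirname "$dir")
done
echo "$found"   # -> /tmp/demo/usr/src/node_modules
```

The first hit is the parent's node_modules, which is why the moved folder is transparent to the app.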

The beauty of this system is that Docker rebuilds an image only from the first layer where something has changed since the last build. So in our case, it'll only re-run npm install if one of the earlier layers changed, say because of an updated package.json. If not, it won't execute the npm install command at all, even if, say, index.js has changed.

Of course, keep in mind you should have node_modules listed in your .dockerignore file so that the COPY . . doesn't try to pick it up.
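A minimal .dockerignore for this setup might look like the following (the npm-debug.log and .git entries are common additions, not something the answer requires):

```
node_modules
npm-debug.log
.git
```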

answered Oct 07 '22 by ThaDon