I have a Node.js project (actually a Firebase project) where I keep the code on Google Drive. (I could use, for example, Dropbox instead; the important thing is that the code files are mirrored.)
Now I want to develop this project on another computer too. I do not understand how to handle the node_modules directory. There are currently a stunning 15,000 files in there. Should I exclude them from mirroring? Or is this just a bad setup?
You do not need to include the node_modules folder when you manually upload code as a .zip file; you can successfully deploy your function without the node_modules folder.
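Whether you upload a .zip through the console or deploy with the Firebase CLI, the dependencies declared in package.json are installed on the build server, so node_modules never has to travel with the code. A minimal sketch (the project path is hypothetical):

# Sync or clone only the source files (no node_modules), then
# restore dependencies from package.json on the new machine:
cd my-firebase-project/functions   # hypothetical path
npm install                        # recreates node_modules locally

# Deploying does not need your local node_modules either; the CLI
# reads functions/package.json and installs dependencies remotely:
firebase deploy --only functions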
The solution I came across was to first create an empty node_modules folder, let it sync to the cloud, and then deselect the node_modules folder in the selective-sync settings. When you later run npm install in the project root folder, OneDrive detects a conflict between the local contents and the folder in the cloud, and it will not sync it.
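Roughly, the steps look like this (the settings path is from OneDrive's selective-sync dialog and may differ slightly by version):

# inside the synced project folder, before installing anything
mkdir node_modules    # empty folder; let OneDrive sync it once
# then: OneDrive Settings -> Account -> Choose folders -> untick node_modules
npm install           # fills the local folder; OneDrive no longer syncs it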
Check node_modules into git for things you deploy, such as websites and apps. Do not check node_modules into git for libraries and modules intended to be reused. Use npm to manage dependencies in your dev environment, but not in your deployment scripts.
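If you go the git route and decide not to track node_modules (the usual choice for reusable libraries), a one-line .gitignore entry is enough; a minimal sketch:

# Ignore installed dependencies; package.json and package-lock.json
# stay tracked, so `npm install` can rebuild node_modules anywhere.
echo "node_modules/" >> .gitignore
git rm -r --cached node_modules   # stop tracking an already-committed copy, if any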
This always bothered me with Dropbox, because selective sync is a hassle: you have to deselect every single folder you want to exclude. There were many requests for a .dropboxignore file, but Dropbox wouldn't listen. I don't know whether any other cloud storage services offer this feature.
EDIT: Dropbox seems to be working on a feature which is currently in beta: https://help.dropbox.com/en-us/files-folders/restore-delete/ignored-files
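At the time of writing, the linked help page describes marking a file or folder as ignored by setting an extended attribute on it; on macOS/Linux that looks something like this (the project path is hypothetical):

# macOS: tell Dropbox to ignore node_modules
xattr -w com.dropbox.ignored 1 ~/Dropbox/my-project/node_modules

# Linux equivalent
attr -s com.dropbox.ignored -V 1 ~/Dropbox/my-project/node_modules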
Anyways: using git is always good, but if you are working on several devices and you don't want to commit and pull/push your work every time you switch devices, you can store your git repository inside your Dropbox. But then the node_modules are still being synced for hours for nothing.
To solve this problem I built a tiny function that moves node_modules outside of the Dropbox folder and then creates a symlink to it. So Dropbox is only syncing the link file, not the contents.
# Folder outside of Dropbox where the real node_modules live
DROPNODE=/usr/local/node_modules_dropbox

# Create the dump folder once, if it does not exist yet
init_dropnode() {
  if [ ! -d "$DROPNODE" ]; then
    mkdir -p "$DROPNODE"
  fi
}

# Move the current project's node_modules out of Dropbox and
# leave a symlink behind, so Dropbox only syncs the link file
dropnode() {
  bp="$DROPNODE/$(basename "$PWD")"
  p="$bp/node_modules"
  if [ -L "./node_modules" ]; then
    # already a symlink: remove it, make sure the target exists
    rm node_modules
    mkdir -p "$p"
  else
    if [ -d "./node_modules" ]; then
      # a real folder: move its contents outside of Dropbox
      mkdir -p "$bp"
      mv node_modules "$p"
    else
      # nothing there yet: create an empty target folder
      mkdir -p "$p"
    fi
  fi
  ln -s "$p" node_modules
}
I put these in my .bash_profile, so when I'm in a project folder I just do dropnode and the hassle stays outside of my Dropbox.
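For what it's worth, a typical session with these functions looks something like this (the project path is hypothetical, and creating the dump folder under /usr/local may require appropriate permissions):

# one-time setup after adding the functions to ~/.bash_profile
source ~/.bash_profile
init_dropnode            # creates /usr/local/node_modules_dropbox

# per project, inside Dropbox
cd ~/Dropbox/my-project
dropnode                 # node_modules becomes a symlink
npm install              # installs into the folder outside Dropbox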
Disclaimer: this only works on Linux/macOS, and you need to set the same path for the node_modules dump folder on every computer. And if you are mounting the code into a Docker container, this will not work, because symlinks are being ignored.