My project is six months old, and git is very slow. We track around 30 binary files, each between 5 MB and 50 MB, and we keep them directly in git. I believe those files are what is making git slow.
Is there a way to kill all files larger than 5 MB from the repository? I know I would lose all of those files, and that is okay with me.
Ideally I would like a command that lists all the big files (> 5 MB), so I can review the list, confirm, and have them deleted to make git faster.
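One common way to get such a list is to walk every object in history and filter blobs by size. The following is a runnable sketch in a throwaway repo (the repo setup and the `big.bin` file are illustrative; in a real repository you would run only the final pipeline):

```shell
# Throwaway demo repo with one fake 6 MB binary
cd "$(mktemp -d)" && git init -q
git config user.email you@example.com && git config user.name you
dd if=/dev/zero of=big.bin bs=1024 count=6144 2>/dev/null
git add big.bin && git commit -qm "add big binary"

# List every blob over 5 MB (5242880 bytes) anywhere in history,
# largest first: size in bytes, then path
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" && $3 > 5242880 {print $3, $4}' |
  sort -rn
```

`%(objectsize)` is the uncompressed blob size, so the threshold matches the size of the file as you see it on disk.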
I should mention that git is slow not only on my machine: deploying the app to the staging environment now takes around 3 hours. So the fix needs to affect the server, not just individual users of the repository.
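Actually deleting the files from history requires rewriting every commit. A hedged sketch in a throwaway repo, using the built-in (but deprecated) `git filter-branch`; the separate `git filter-repo` tool, with `--strip-blobs-bigger-than 5M`, is the recommended modern equivalent. The file names here (`big.bin`, `app.txt`) are illustrative:

```shell
export FILTER_BRANCH_SQUELCH_WARNING=1
cd "$(mktemp -d)" && git init -q
git config user.email you@example.com && git config user.name you
dd if=/dev/zero of=big.bin bs=1024 count=6144 2>/dev/null
git add big.bin && git commit -qm "add big binary"
echo code > app.txt && git add app.txt && git commit -qm "add code"

# Rewrite every commit, dropping big.bin from each commit's index
git filter-branch --force --prune-empty \
  --index-filter 'git rm --cached --ignore-unmatch big.bin' -- --all

# Drop the backup refs and repack so the old objects actually go away
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

Because history is rewritten, every clone (including the server's) must be force-pushed to or re-cloned afterwards, which is what makes this fix reach the staging environment too.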
Can Git handle large files? Not well on its own, which is why many teams add Git LFS to manage large files in Git.
Git can usually detect binary files automatically.
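For completeness, a sketch of what Git LFS tracking sets up. Normally you would run `git lfs install` and `git lfs track "*.bin"` (both require the git-lfs extension); those commands write an attributes rule like the one below, which this sketch writes directly so it runs without git-lfs installed. The `*.bin` pattern is illustrative:

```shell
cd "$(mktemp -d)" && git init -q

# Equivalent of: git lfs track "*.bin"
echo '*.bin filter=lfs diff=lfs merge=lfs -text' >> .gitattributes
git add .gitattributes
```

Once committed, matching files are stored as small pointer files in git, with the real content kept in LFS storage.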
The first thing to determine is whether the poor behavior is due to your machine or to your specific local copy of the repo. The files in your .git folder can affect performance in various ways: settings in .git/config, the presence of LFS files, commits that could be garbage collected, and so on.
Running git commit to commit your staged changes is generally fast because actually staging the changes did most of the work.
Do you garbage collect?
git gc
This can make a significant difference in speed, even for small repos.
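You can see the effect with `git count-objects`, which reports how many loose (unpacked) objects the repo holds. A runnable sketch in a throwaway repo:

```shell
cd "$(mktemp -d)" && git init -q
git config user.email you@example.com && git config user.name you
echo hello > a.txt && git add a.txt && git commit -qm "one commit"

git count-objects -v          # "count:" is the number of loose objects
git gc --quiet --prune=now    # pack loose objects, prune unreachable ones
git count-objects -v          # loose count drops to 0 after packing
```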