I've read the existing questions about storing binary files in a Git repository, but some aspects are still not clear.
The repository contains around 50 MB of code sources and around 1 GB of binary files. The binary files are seldom changed.
If the files are never involved, they make no difference in terms of performance.
Each commit records which files it modified, so when a commit is applied, files that do not appear in it simply don't matter, whether they are 1 KB or 1 GB. If a file does appear in a commit it will obviously matter, as binary files are typically slower to deal with.
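You can see this directly by asking Git which files a given commit touches; anything not listed carries no cost when the commit is applied:

```sh
# Show the diffstat for the latest commit: only changed files appear.
git show --stat HEAD

# Or just the file names, without the diffstat:
git show --name-only HEAD
```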
Now, the main problem is that cloning a repository is not the only action that involves applying commits. When you switch to a different branch, for instance, Git has to unwind the commits back to the common ancestor and then apply the commits of the other branch up to the desired checkout commit; when merging or rebasing, Git has to analyse commits to find the differences.
Basically, whenever Git has to read a commit that modifies a binary file, performance will very likely be affected, and because of the way Git works, commits get "used" quite often.
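To get a rough idea of how often your history actually touches the binaries, you can filter the log by pathspec (the patterns and path below are just illustrative):

```sh
# Commits that modify any file matching these patterns; each one
# is a commit where Git must handle the large blobs.
git log --oneline -- '*.bin' '*.dll'

# Per-commit churn detail for one specific binary
# (the path is a placeholder):
git log --stat --follow -- assets/big-file.bin
```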
As to your question, it basically depends on what you mean by "seldom changed". As long as the branches you typically work on don't modify the binary files, this shouldn't be a problem; but if checking out different commits means applying modifications to those binaries, performance will be affected.
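One quick check before switching branches is to diff the two branches over the binary paths only; if nothing shows up, the checkout won't have to rewrite the large files (branch names and patterns below are hypothetical):

```sh
# Binary-file changes between the merge base of 'main' and 'feature'
# and the tip of 'feature'; no output means no binary churn between
# the two branches.
git diff --stat main...feature -- '*.dll' '*.iso'
```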
It can also influence operations like git gc or git repack, where deltification is done. See "Are Git's pack files deltas rather than snapshots?".
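Repacking, for example, rewrites the pack files and attempts deltification, which is where large binaries (which delta poorly) make themselves felt:

```sh
# Repack everything into a single pack, attempting deltas;
# this is slow when the history contains many large binaries.
git repack -a -d

# Inspect the resulting pack size and object counts:
git count-objects -v
```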
That is why I generally store in version control only a text file declaring where to find the binaries I need, as opposed to storing the binaries themselves. See "git include compiled dll from another repository" as an example.
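Here is a minimal sketch of that pointer-file approach, assuming a hypothetical binaries.txt manifest of "<sha256> <path> <url>" lines; the format, paths, and URLs are all illustrative, not a standard:

```sh
# fetch-binaries.sh: download any binary listed in the manifest
# that is missing locally, then verify its checksum.
# binaries.txt format (hypothetical): <sha256>  <path>  <url>
while read -r sha path url; do
    if [ ! -f "$path" ]; then
        mkdir -p "$(dirname "$path")"
        curl -L -o "$path" "$url"
    fi
    # sha256sum -c expects "<checksum>  <file>" lines on stdin
    echo "$sha  $path" | sha256sum -c -
done < binaries.txt
```

Only the small manifest is versioned, so the repository history stays light; tools like Git LFS automate the same idea with pointer files.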