I want to use Git to track micro changes (several a day) on a large working directory (many gigs). The data will be mixed binary/plain text. Binary data won't change nearly as much as the text information. Access to old commits will rarely be needed, and can be slow, whereas recent history needs to be fast.
I don't want to lose old data permanently, just move it to a backup server or something. Is there something in Git that allows old history to be archived and keep only a certain subset in the local repository?
If not, is there a tool that is better suited for this purpose? I like Git because I know it and I want the version control and diffs. I won't be needing any of Git's advanced features (branching/merging, distributed workflows), so any similar VCS would be fine.
If you're patching with `git format-patch`, then create a shallow clone with `git clone --depth <depth>` and proceed. Odds are you're not, though, in which case you'll probably find this answer and this answer useful. The second concludes that `git checkout --orphan` is perhaps the best way to get what you want. Of course, you'll still need to clone the complete history locally once to make a smaller branch of it.
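To illustrate the shallow-clone route, here is a minimal sketch. The repository paths (`/tmp/full-repo`, `/tmp/shallow-repo`) and the identity settings are made up for the example; the `file://` URL matters, because `--depth` is ignored for plain local-path clones.

```shell
# Build a small "server" repo with two commits to clone from.
rm -rf /tmp/full-repo /tmp/shallow-repo
git init -q /tmp/full-repo
git -C /tmp/full-repo -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "old history"
git -C /tmp/full-repo -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "recent work"
# --depth 1 keeps only the newest commit locally; file:// forces a real
# transport so the depth limit is honored.
git clone -q --depth 1 file:///tmp/full-repo /tmp/shallow-repo
git -C /tmp/shallow-repo rev-list --count HEAD   # prints 1: only recent history is local
```

The full history stays on the origin, so old commits remain recoverable (e.g. via `git fetch --unshallow`) even though the local clone stays small.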
If you're feeling adventurous, want this badly, and are willing to put up with a more complicated push process, creating patches with `git format-patch` and applying them to another repository with `git am` is neither difficult to execute nor to script. It would add an extra layer to your push process -- e.g. create a patch on the shallow repo, apply it programmatically to a full repo (either local or somewhere else), and push from the latter. The time and trouble probably isn't worth it, but it certainly is possible.
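A hedged sketch of that `format-patch` / `am` round trip, using invented paths: `/tmp/archive-repo` plays the full-history repository and `/tmp/work-repo` the shallow clone you actually work in.

```shell
# Set up a full-history repo and a shallow working clone of it.
rm -rf /tmp/archive-repo /tmp/work-repo /tmp/latest.patch
git init -q /tmp/archive-repo
git -C /tmp/archive-repo -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "base"
git clone -q --depth 1 file:///tmp/archive-repo /tmp/work-repo
# Make a micro change in the shallow repo and commit it.
echo "note" > /tmp/work-repo/notes.txt
git -C /tmp/work-repo add notes.txt
git -C /tmp/work-repo -c user.email=you@example.com -c user.name=you \
    commit -q -m "micro change"
# Export the newest commit as a mail-formatted patch...
git -C /tmp/work-repo format-patch -1 --stdout > /tmp/latest.patch
# ...and replay it onto the archive repo, which keeps the complete history.
git -C /tmp/archive-repo -c user.email=you@example.com -c user.name=you \
    am -q /tmp/latest.patch
```

In practice you would script the export/apply step, and push from the archive repo so the remote always sees full history.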