In Git, there is the command git clone --depth=<depth> to retrieve only a specific length of history, and there is also a command to gather more historical data: git fetch --depth=<depth>.
What about when we want to free some space in a large repository? I know that we may use git gc or git prune, but is there another way, something like --depth=<depth>, to reduce the number of commits stored in the local repository? It should also keep the SHA-1s, so we are able to continue working with it.
The easiest way would be to:
git clone --depth=n /url/of/remote/repo
That would clone the last n commits, while allowing fetch/pull/push to still work with the remote repo.
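For illustration, a minimal sketch of that workflow, assuming the repository was cloned shallow (the URL and depth values are placeholders):

git clone --depth=50 https://example.com/repo.git   # placeholder URL: keep only the last 50 commits
cd repo
git fetch --depth=10                   # a shallow repository's history can later be shortened further
git reflog expire --expire=now --all   # drop reflog entries still pointing at the old commits
git gc --prune=now                     # delete the now-unreachable objects to reclaim disk space
git fetch --unshallow                  # or, if needed later, restore the full history instead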
Since Git 2.5, you can fetch a single commit, but unless that commit is the latest one (which is like a git clone --depth=1), that would not allow for fetch/pull/push.
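As a sketch of that single-commit fetch, assuming the server operator has enabled it (the URL and <sha1> are placeholders):

git config uploadpack.allowReachableSHA1InWant true   # on the server, Git 2.5+
git init single && cd single                          # on the client: start an empty repo
git remote add origin https://example.com/repo.git    # placeholder URL
git fetch --depth=1 origin <sha1>                     # fetch exactly that commit and its tree
git checkout FETCH_HEAD                               # detached HEAD at the fetched commit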
The other approach to make sure a given local repo is as lean as possible is to use a combination of gc/prune/repack:
git gc --aggressive # more thorough optimization, at the cost of a longer run
git repack -Ad # kills in-pack garbage
git prune --progress # kills loose garbage
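To see how much space such a cleanup actually reclaims, Git's object statistics can be compared before and after running it (a quick check, not part of the cleanup itself):

git count-objects -vH   # human-readable counts and sizes of loose and packed objects
du -sh .git             # total on-disk size of the repository database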