I'd like to know if it's possible to minimize the download times of large files when using Git LFS, specifically in the following scenarios:
- keeping already-downloaded files available when switching between branches, so they don't have to be fetched again
- pointing LFS at a different (closer) download endpoint
[I know git-annex has better support for these features, but its Windows support is problematic.]
To the best of my knowledge, Git LFS does keep files when switching branches: it's checksum-based and stores every blob locally under .git/lfs/objects once it has been downloaded, so switching back to a branch doesn't re-fetch objects that are already in that cache.
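If you want to confirm the cache behaviour on your own machine, here's a small sketch using standard git-lfs commands (the exact paths and sizes printed will depend on your setup):

# Show the local LFS storage location (reported as LocalMediaDir)
git lfs env

# List the LFS-tracked files in the current checkout
git lfs ls-files

# Inspect the cached objects directly; a branch switch reuses anything already here
du -sh .git/lfs/objects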
As for pointing LFS to a different endpoint, that's already supported: in your .git/config you can set the LFS URL the remote points to:
[remote "origin"]
url = https://...<repo_url>
fetch = +refs/heads/*:refs/remotes/origin/*
lfsurl = "https://<another repo that's closer to you>"
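If you'd rather not edit .git/config by hand, the same setting can be applied from the command line; the URL here is just a placeholder for your closer mirror:

# Point this clone's origin remote at a different LFS endpoint
git config remote.origin.lfsurl "https://lfs-mirror.example.com/my-repo"

# Check which endpoint Git LFS will actually talk to
git lfs env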
There are also several services that provide LFS support so you can keep the storage on your local corporate network, such as Artifactory, GitHub Enterprise, and Bitbucket, depending on your use case; see the .lfsconfig sketch below.
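If you go that route, one option is to commit a .lfsconfig file at the repository root so every clone picks up the on-premises endpoint automatically. The server name and path below are purely illustrative and depend on the product you use:

[lfs]
    url = "https://artifactory.example.corp/artifactory/api/lfs/my-lfs-repo"

As far as I know, settings in .git/config take precedence over .lfsconfig, so individual clones can still override the committed endpoint if they need to.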
You might find this issue's conversation helpful as well.