I am using git-fat with git to handle large binary files. The problem I am trying to solve is to prevent large binaries from being added to local commits IF they are not going to be handled by git-fat.
My approach is to update the pre-commit hook to block all binary files and only allow a file IF git fat init was run and git check-attr filter on the file shows filter: fat in the output (rough sketch below).

Am I missing a trick with this? What is a good way to solve this problem with git and git-fat?
For example, is there anything I can do with .git/fat/objects to determine that git-fat handled the file when git add was called on the binary file?
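For reference, here is roughly the hook I have in mind (a sketch only; a 5 MB size threshold stands in for real binary detection):

```bash
#!/bin/bash
# pre-commit hook sketch: block large staged files unless the git-fat
# filter covers them. A size threshold stands in for real binary
# detection, and 5 MB is arbitrary.
LIMIT=$((5 * 1024 * 1024))
fail=0

while IFS= read -r file; do
    # Size of the staged blob; files already cleaned by git-fat are tiny
    # stubs, so they pass this check automatically.
    size=$(git cat-file -s ":$file" 2>/dev/null) || continue
    [ "$size" -le "$LIMIT" ] && continue

    # git check-attr prints "<path>: filter: fat" when the fat filter applies.
    attr=$(git check-attr filter -- "$file")
    if [[ "$attr" == *": filter: fat" ]]; then
        # Attribute is set but the staged blob is still large, so the clean
        # filter probably did not run (e.g. git fat init missing in this clone).
        echo "ERROR: $file ($size bytes) matches filter=fat but was not cleaned" >&2
    else
        echo "ERROR: $file ($size bytes) is not covered by the git-fat filter" >&2
    fi
    fail=1
done < <(git diff --cached --name-only --diff-filter=AM)

exit $fail
```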
Am I missing a trick with this?
It is best to add a restriction on the server side that limits or blocks the size of a push, preventing any large binary that might have fallen through the cracks from getting in.
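For instance, a pre-receive hook along these lines would reject any push that introduces an oversized blob; this is only a sketch, and the 10 MB cap is arbitrary (git-fat stubs are tiny, so files that went through the fat clean filter always pass):

```bash
#!/bin/bash
# pre-receive hook sketch (server side): reject any push that introduces
# a blob larger than LIMIT.
LIMIT=$((10 * 1024 * 1024))   # arbitrary 10 MB cap
zero=0000000000000000000000000000000000000000

while read -r oldrev newrev refname; do
    [ "$newrev" = "$zero" ] && continue   # ref deletion: nothing to check
    # List only objects that are new to the repository, with type and size.
    toobig=$(git rev-list --objects "$newrev" --not --all \
        | awk '{print $1}' \
        | git cat-file --batch-check='%(objecttype) %(objectsize) %(objectname)' \
        | awk -v limit="$LIMIT" '$1 == "blob" && $2 > limit {print $3 " (" $2 " bytes)"}')
    if [ -n "$toobig" ]; then
        echo "Push to $refname rejected, blobs over $LIMIT bytes:" >&2
        echo "$toobig" >&2
        exit 1
    fi
done
exit 0
```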
To detect whether git fat init was executed, you can simply query the values of git config filter.fat.clean and git config filter.fat.smudge.
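For example, at the top of the pre-commit hook (this only checks that the keys are set, not their exact values, which can vary between git-fat versions):

```bash
# Abort early if "git fat init" has not configured the filters in this clone.
# git config exits non-zero when a key is unset.
if ! git config filter.fat.clean >/dev/null || ! git config filter.fat.smudge >/dev/null; then
    echo "ERROR: run 'git fat init' in this clone before committing" >&2
    exit 1
fi
```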
git-fat is interesting in that it does not require any special feature/support on the Git hosting server itself: the storage/replacement of large files is handled by the local filters.
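For example, a typical setup looks like this (the extensions are placeholders; the exact clean/smudge commands written to the local config depend on the git-fat version):

```bash
# Route the binary types you care about through the "fat" filter.
echo '*.iso filter=fat -text' >> .gitattributes
echo '*.zip filter=fat -text' >> .gitattributes
git add .gitattributes

# "git fat init" wires the "fat" filter to git-fat's clean/smudge commands
# in the clone's local .git/config; nothing is installed on the server.
git fat init
```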
A more recent take on the same kind of large-file management is Git LFS, which also operates through Git filters.
Of course, the ultimate solution, which does not require filters and is fully automated, is VFS for Git, the Git Virtual Filesystem developed by Microsoft and slowly being merged back into Git itself. But it does require a Git repository hosting server that supports that capability.
The OP adds:
Is there anything I can do with the .git/fat/objects to determine that git-fat handled the file when git add was called on the file?
The command git fat verify does check each object: it could be tweaked to look at a specific file and compare its SHA1 against the catalog_objects set.
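As a rough sketch of that idea, something like this could be run per staged path from the pre-commit hook; the stub format (#$# git-fat <sha1> <size>) and the default .git/fat/objects location are assumptions to double-check against your git-fat version:

```bash
#!/bin/bash
# Sketch: verify that git-fat handled <file> when it was staged.
file="$1"
objdir="$(git rev-parse --git-dir)/fat/objects"

# Look at the staged (index) version of the file, not the working tree copy.
if ! git cat-file blob ":$file" 2>/dev/null | head -c 12 | grep -q '#\$# git-fat'; then
    echo "ERROR: staged content of $file is not a git-fat stub" >&2
    exit 1
fi

# The third whitespace-separated field of the stub is the SHA1 of the real content.
sha=$(git cat-file blob ":$file" | head -c 72 | awk '{print $3}')

if [ -f "$objdir/$sha" ]; then
    echo "OK: $file was cleaned by git-fat (object $sha is present in $objdir)"
else
    echo "WARNING: stub found but $sha is missing from $objdir" >&2
    exit 1
fi
```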