I'm working with a large number of binary files. After a recent change to a local git repo, I tried to push my changes back up to the remote, only to receive the following error.
remote: fatal: pack exceeds maximum allowed size
Unfortunately I can't use the strategy described here, since all the changes are contained in a single commit. Any suggestions? How can I get around this pack size restriction?
A lot of serialized files are generated whenever the code is modified and rerun, so the result is one giant commit containing many smaller files.
That means you can split that huge commit into several smaller ones.
git reset HEAD~
will be enough to "un-commit" all the files while keeping them in your working tree. Then modify your script (which currently adds and commits everything after the "serialized files" generation) so that it adds, commits, and pushes only a batch of files at a time, instead of everything at once. Each push then sends a smaller pack, which stays under the server's limit.
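As a rough sketch, that batching logic could look like the function below. The `serialized` directory name, the batch size of 100, and the commit message are placeholders to adjust for your setup; it also assumes the generated filenames contain no spaces.

```shell
# Sketch: un-commit the oversized commit, then re-commit and push
# the generated files in batches. Names and sizes are assumptions.
push_in_batches() {
    # Undo the last commit; the files stay in the working tree.
    git reset HEAD~

    # xargs -n 100 groups the filenames into lines of up to 100,
    # so each loop iteration commits and pushes one batch.
    find serialized -type f | xargs -n 100 | while read -r batch; do
        git add $batch   # intentionally unquoted: space-separated list
        git commit -m "serialized files: next batch"
        git push origin HEAD
    done
}
```

Each `git push` now transfers only one batch's worth of objects, so no single pack should exceed the server's maximum allowed size.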