I have a repo from which pull takes forever because the server has little free RAM and it is swapping a lot while
remote: Compressing objects: 24%
is happening (even if I clone locally on the server). The network is not that constrained, so it would be fine to send all data uncompressed. How can I do that?
Background: Git stores its data as compressed objects, whereas SVN stores them as uncompressed copies.
By default, git stores objects in .git/objects as their original contents, zlib-compressed, and prepended with a short header. These are easy to read with a small amount of code, and git will read the files for you with git show or git cat-file.
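To see this loose-object storage in action, here is a minimal sketch in a throwaway repository (the file name and contents are arbitrary; git add writes the blob into .git/objects, and git cat-file decompresses it back):

```shell
dir=$(mktemp -d)
cd "$dir"
git init -q
echo "hello" > file.txt
git add file.txt                      # writes the blob into .git/objects
sha=$(git hash-object file.txt)       # same ID git computed when adding
# the object lives under .git/objects/<first 2 hex chars>/<remaining 38>
ls ".git/objects/$(printf %s "$sha" | cut -c1-2)"
git cat-file -p "$sha"                # prints: hello
```

The two-character directory fan-out is just how git shards its object store; cat-file -p handles the decompression and header stripping for you.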
Delta compression (also called delta encoding, or just delta coding) is where only the differences from a known base file are stored, discarding anything the two files share. To decompress, you apply the stored changes (also called "diffs") to the base file, leaving you with the new file.
From the git documentation:
core.bigFileThreshold Files larger than this size are stored deflated, without attempting delta compression. Storing large files without delta compression avoids excessive memory usage, at the slight expense of increased disk usage. Default is 512 MiB on all platforms. This should be reasonable for most projects as source code and other text files can still be delta compressed, but larger binary media files won't be. Common unit suffixes of 'k', 'm', or 'g' are supported.
So I guess setting this value to something like 1 would do the trick: every file would then count as "big" and be stored deflated without delta compression.
Extended by comments: you can set this with a git config core.bigFileThreshold 1 command (--add is unnecessary for a single-valued key). It works for bare repos as well.
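A minimal sketch of applying this on the server side, using a hypothetical bare repository name (git -C runs the command inside the given repository directory, so you don't have to cd into it):

```shell
dir=$(mktemp -d)
cd "$dir"
git init -q --bare server.git                      # stand-in for the server repo
git -C server.git config core.bigFileThreshold 1   # treat every file as "big"
git -C server.git config core.bigFileThreshold     # verify: prints 1
```

After this, repacking and serving fetches from that repository should skip delta search entirely, trading larger transfers and on-disk packs for far less CPU and memory use.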