 

git pull without remotely compressing objects

Tags:

git

I have a repository full of zip files; re-compressing these files is a waste of time.

I've tried to set core.compression = 0 on both the remote and the local copy, without success:

git config core.compression 0
git config core.loosecompression 0

git pull still shows:

remote: Counting objects: 23, done.
remote: Compressing objects: ...
hdorio asked Aug 18 '11 03:08




1 Answer

The time problem I had was caused by delta compression.

The solution for me was:

echo '*.zip -delta' > .gitattributes
git gc 
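To confirm the attribute actually applies, git check-attr reports how a path resolves against .gitattributes. A minimal sketch (the file name release.zip is hypothetical; run it inside the repository):

```shell
# With '*.zip -delta' in .gitattributes, the delta attribute
# should resolve to "unset" for any zip file in the repo.
git check-attr delta -- release.zip
# prints: release.zip: delta: unset
```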

I will quote this excellent response from the git mailing-list thread "Re: serious performance issues with images, audio files, and other 'non-code' data":

Git does spend a fair bit of time in zlib for some workloads, but it should not create problems on the order of minutes.

For pushing and pulling, you're probably seeing delta compression, which can be slow for large files.

 core.compression 0   # Didn't seem to work. 

That should disable zlib compression of loose objects and objects within packfiles. It can save a little time for objects which won't compress, but you will lose the size benefits for any text files.

But it won't turn off delta compression, which is what the "compressing..." phase during push and pull is doing. And which is much more likely the cause of slowness.

 pack.window 0 

It sets the number of other objects git will consider when doing delta compression. Setting it low should improve your push/pull times. But you will lose the substantial benefit of delta-compression of your non-image files (and git's meta objects). So the "-delta" option above for specific files is a much better solution.
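As a sketch of that blunter alternative (run on the repo being served, per the note below), pack.window can be set and verified like any other git config key:

```shell
# Consider zero candidate objects during delta search,
# effectively disabling delta compression repo-wide.
git config pack.window 0

# Verify the setting took effect.
git config --get pack.window
# prints: 0
```

Note this trades away delta compression for everything, which is why the per-path "-delta" attribute above is preferable.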

 echo '*.jpg -delta' >> .gitattributes 

Also, consider repacking your repository, which will generate a packfile that will be re-used during push and pull.
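A minimal repack sketch, assuming you run it inside the repository in question:

```shell
# -a: pack all reachable objects into a fresh packfile
# -d: delete the old, now-redundant packs and loose objects
git repack -a -d

# Inspect the result: object and pack counts.
git count-objects -v
```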

Note that the settings have to be made on the repo you are fetching/pulling from, not the one you are fetching/pulling to.

hdorio answered Sep 22 '22 09:09