
Is there a way to limit the amount of memory that "git gc" uses?

I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope would be that it can trade memory usage for speed and just take a little longer to do its work.

asked Jun 22 '10 by sam2themax

People also ask

When should we not run git gc?

Running git gc manually should only be needed when adding objects to a repository without regularly running such porcelain commands, to do a one-off repository optimization, or e.g. to clean up a suboptimal mass-import.

What is git gc --prune?

git gc is a parent command and git prune is a child. git gc will internally trigger git prune. git prune is used to remove Git objects that have been deemed inaccessible by the git gc configuration.
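For illustration (these commands are my own addition, not quoted from the snippet), pruning can also be controlled explicitly: git gc accepts a --prune=<date> option, and git prune can report what it would remove without deleting anything:

git gc --prune=now              # prune unreachable loose objects immediately instead of after the default grace period
git prune --dry-run --verbose   # list what git prune would delete, without deleting it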

Should I run git gc?

You should consider running git gc manually in a few situations, for example if you have just completed a git filter-branch. Recall that filter-branch rewrites many commits, introduces new ones, and leaves the old ones on a ref that should be removed when you are satisfied with the results.
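As a rough sketch of that cleanup (assumed commands, not quoted from the snippet; filter-branch keeps its backup refs under refs/original/, and the reflog can also keep the old commits alive):

git for-each-ref --format='%(refname)' refs/original/ | xargs -n 1 git update-ref -d   # drop the backup refs left by filter-branch
git reflog expire --expire=now --all   # expire reflog entries that still reference the old commits
git gc --prune=now                     # now the rewritten-away objects can actually be collected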


2 Answers

I used the instructions from this link. It's the same idea as Charles Bailey suggested.

A copy of the commands is here:

git config --global pack.windowMemory "100m"
git config --global pack.packSizeLimit "100m"
git config --global pack.threads "1"

This worked for me on HostGator with a shared hosting account.
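One usage note (my addition, not from the answer): the new limits only apply to future repacks, so you may want to verify the setting and rerun git gc afterwards:

git config --global --get pack.windowMemory   # should print 100m
git gc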

answered Sep 18 '22 by hopeithelps


Yes, have a look at the help page for git config and check the pack.* options, specifically pack.depth, pack.window, pack.windowMemory and pack.deltaCacheSize.

It's not an exact limit, as git needs to map each object into memory, so one very large object can still cause a lot of memory usage regardless of the window and delta cache settings.
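A minimal sketch of tuning those options inside the repository (the values here are illustrative guesses, not recommendations from this answer):

git config pack.window "10"            # how many objects to consider when searching for delta candidates
git config pack.depth "10"             # limit the length of delta chains
git config pack.windowMemory "64m"     # cap the memory used by the delta search window
git config pack.deltaCacheSize "64m"   # cap the cache of computed deltas
git gc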

You may have better luck packing locally and transferring the pack files to the remote side "manually", adding a .keep file so that the remote git never tries to completely repack everything.
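A sketch of that workflow, assuming a bare repository on the server and with pack-<sha1> standing in for the actual pack file name:

git repack -a -d -f            # run locally, where memory is not constrained
scp .git/objects/pack/pack-<sha1>.pack .git/objects/pack/pack-<sha1>.idx \
    user@host:repo.git/objects/pack/
ssh user@host touch repo.git/objects/pack/pack-<sha1>.keep   # a .keep file marks the pack as kept so git won't repack it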

answered Sep 20 '22 by CB Bailey