
Issues after trying to repack a git repo for improved performance

Tags:

git

A while ago I posted a question asking for feedback on a plan to fix a repo that was slow because of many big binary files. That question (background, not required reading for this one): Fixing up a git repo that is slowed because of big binary files

I followed through with my plan, and ran into unexpected side effects.

A fresh clone of our repo originally took 2-3 hours. I figured out that the server started swapping, and after running git config pack.windowMemory 100m && git config pack.packSizeLimit 200m, the clone time dropped to ~15 minutes.

I figured I'd still carry out the rest of my plan, so I disabled delta compression for the binary types we have and ran git repack -a -d -F on the repo.
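For the record, the mechanism for disabling delta compression is a -delta attribute in .gitattributes; the file types below are placeholders, not our exact list:

    # .gitattributes in the repo root: skip delta compression for binary types
    *.zip -delta
    *.iso -delta
    *.mp4 -delta

    # rewrite all packs from scratch; -F passes --no-reuse-object
    # to git pack-objects, so nothing from the old packs is reused
    git repack -a -d -F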

After this, a fresh clone of the repo takes ~20 minutes, so it actually got slightly worse. But the real problem is that every time someone who had already cloned the repo tries to push commits, they get "Auto packing the repository for optimum performance.".

Any ideas on what might have gone wrong, and how it can/should be fixed?

asked Nov 13 '12 by anr78

1 Answer

Probably the size of your repo combined with your low value for pack.packSizeLimit keeps the number of packs above gc.autoPackLimit, so git gc --auto (which prints "Auto packing the repository for optimum performance.") kicks in every time. Increase either of those settings to make sure gc doesn't run on each push.
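A sketch of how to check and adjust this (gc.autoPackLimit defaults to 50; the values below are only illustrative):

    # see how many packfiles the repo currently has (the "packs:" line)
    git count-objects -v

    # raise the pack-count threshold that triggers auto gc
    git config gc.autoPackLimit 100

    # ...or allow bigger (and therefore fewer) packs
    git config pack.packSizeLimit 500m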

I'm not sure in what way packSizeLimit would affect memory, but I don't believe it has any significant effect; please correct me if your experiments show otherwise. The parameters that directly affect memory use are pack.windowMemory and pack.deltaCacheSize.
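If memory during repacking is the concern, these are the more direct levers (values here are only examples):

    # per-thread cap on memory used for the delta search window
    git config pack.windowMemory 100m

    # cap on the cache of computed deltas before they are written to a pack
    git config pack.deltaCacheSize 128m

    # windowMemory is per thread, so fewer threads also means less total memory
    git config pack.threads 2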

answered Nov 15 '22 by clacke