Why does git fail on push/fetch with "Too many open files"

Tags: git, linux, config

I'm encountering an issue with Git where I receive the following message:

$ git fetch
error: cannot create pipe for ssh: Too many open files
fatal: unable to fork

The System Administrators have increased my file limit, but it has not corrected the issue. Additionally, I don't have an issue with creating new files with vi.

When trying to push a new branch, I get a similar message:

$ git push origin test_this_broken_git
error: cannot create pipe: Too many open files
fatal: send-pack: unable to fork off sideband demultiplexer

Could somebody explain exactly why this is happening? I have not made any recent changes to my git config, and I have verified that manually.

asked Mar 13 '13 by Hazok



1 Answer

Why did this happen?

From the git documentation on the gc.auto setting:

> When there are approximately more than this many loose objects in the repository, git gc --auto will pack them. Some Porcelain commands use this command to perform a light-weight garbage collection from time to time. The default value is 6700.

Here "Some Porcelain commands" includes git push, git fetch etc. So if the max open files limit ulimit -n < 6700, you'll be eventually blocked by git gc --auto once you got ~6700 loose objects in a single git repo.

I'm in a hurry. How do I fix it?

If you have sufficient permissions, raise the open-file limit for the current shell session (ulimit is a shell builtin, so run it directly in your shell; prefixing it with sudo won't work):

$ ulimit -n 8192
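This change only lasts for the current session. A sketch for making it persistent on most Linux systems via the PAM limits file (youruser is a placeholder for the actual account name):

# /etc/security/limits.conf (root required; takes effect at the next login)
youruser  soft  nofile  8192
youruser  hard  nofile  16384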

Otherwise, you may disable automatic garbage collection by setting git config gc.auto 0, so that you can push your local commits to the remote, delete the repo, and clone it back without the thousands of loose objects.
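A minimal sketch of that workaround, assuming a remote named origin, the branch from the question, and placeholder repo names:

$ git config gc.auto 0                  # this repo only: stop git gc --auto from running
$ git push origin test_this_broken_git  # the push no longer needs to fork git gc
$ cd .. && rm -rf my_repo               # discard the local copy full of loose objects
$ git clone <url> my_repo               # a fresh clone arrives packed, not loose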

How can we prevent this from happening again?

Set git config --global gc.auto 200, where 200 is some value well below your max open files limit. If you pick too small a value, git gc will run too frequently, so choose wisely.

If you set gc.auto=0, loose objects will never be packed unless you run git gc manually, so hundreds of thousands of files could accumulate in the same directory. That can be a problem, especially on mechanical hard drives or on Windows. (See also: How many files in a directory is too many? and Is it OK (performance-wise) to have hundreds or thousands of files in the same Linux directory?)
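If you do take the gc.auto=0 route, an occasional manual collection keeps the object directory under control; a minimal sketch:

$ git count-objects -v  # the "count" line shows how many loose objects remain
$ git gc                # packs loose objects and prunes stale unreachable ones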

answered Oct 05 '22 by Arnie97