I'm encountering an issue with Git where I'm receiving the following message:
> git fetch
error: cannot create pipe for ssh: Too many open files
fatal: unable to fork
The system administrators have increased my file limit, but it has not corrected the issue. Also, I have no problem creating new files with vi.
When trying to push a new branch, I get a similar message:
> git push origin test_this_broken_git
error: cannot create pipe: Too many open files
fatal: send-pack: unable to fork off sideband demultiplexer
Could somebody explain exactly why this is happening? I have not made any recent changes to my git config, and I have verified that manually.
From the git documentation (describing the gc.auto setting):
When there are approximately more than this many loose objects in the repository, git gc --auto will pack them. Some Porcelain commands use this command to perform a light-weight garbage collection from time to time. The default value is 6700.
Here "Some Porcelain commands" includes git push, git fetch, etc. So if your max open files limit (ulimit -n) is less than 6700, you'll eventually be blocked by git gc --auto once you have roughly 6700 loose objects in a single git repo.
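You can check how close a repository is to that threshold with git count-objects. A minimal sketch (it builds a throwaway repo so it runs anywhere; in practice you would run only the last three lines inside your own checkout):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"; git init -q .
echo hello > f.txt; git add f.txt
git -c user.email=you@example.com -c user.name=you commit -qm init

# "count" in `git count-objects -v` is the number of loose objects;
# compare it against gc.auto's default (6700) and your descriptor limit.
loose=$(git count-objects -v | awk '/^count:/ {print $2}')
limit=$(ulimit -n)
echo "loose objects: $loose (gc.auto default: 6700, ulimit -n: $limit)"
```

If the loose-object count is near 6700 and above your ulimit, the next fetch or push will trip the automatic gc.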
If you have sufficient permissions, raise the open-files limit. Note that ulimit is a shell builtin, so sudo ulimit will not work; raise the soft limit in your current shell (up to the hard limit):
$ ulimit -n 8192
To raise the hard limit persistently on Linux, edit /etc/security/limits.conf (or ask your administrators to).
Otherwise, you may disable automatic gc with git config gc.auto 0, so that you can push your local commits to the remote, delete the repo, and clone it back without thousands of loose objects.
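That push-delete-reclone workaround can be sketched end to end. This uses a local bare repository as a stand-in for your real remote, and all paths and names here are illustrative:

```shell
set -e
work=$(mktemp -d); cd "$work"
git init -q --bare remote.git              # stand-in for your real remote
git clone -q remote.git repo; cd repo
git config gc.auto 0                       # stop `git gc --auto` from ever running here
echo data > f.txt; git add f.txt
git -c user.email=you@example.com -c user.name=you commit -qm work
git push -q origin HEAD                    # push succeeds without triggering gc
cd ..; rm -rf repo                         # throw away the repo full of loose objects
git clone -q remote.git repo               # a fresh clone arrives packed
```

The fresh clone receives its objects in a packfile, so the loose-object pileup is gone.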
Alternatively, set git config --global gc.auto 200, where 200 is some value well below your max open files limit. If you pick too small a value, git gc will run too frequently, so choose wisely.
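Setting and verifying that option takes two commands. The sketch below points HOME at a throwaway directory so it does not touch your real ~/.gitconfig; drop that line to change your actual global config:

```shell
set -e
export HOME=$(mktemp -d)          # throwaway HOME; remove to edit your real ~/.gitconfig
git config --global gc.auto 200   # pack once ~200 loose objects accumulate
val=$(git config --global gc.auto)
echo "gc.auto is now $val"
```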
If you set gc.auto to 0, loose objects will never be packed unless you run git gc manually. So hundreds of thousands of files could accumulate in the same directory, which can be a problem, especially on mechanical hard drives or on Windows. (See also: How many files in a directory is too many? and Is it OK (performance-wise) to have hundreds or thousands of files in the same Linux directory?)
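A manual git gc sweeps those loose objects into a single packfile. A small demonstration in a throwaway repo, showing the loose-object count before and after:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"; git init -q .
for i in 1 2 3; do
  echo "$i" > "f$i"; git add "f$i"
  git -c user.email=you@example.com -c user.name=you commit -qm "c$i"
done
before=$(git count-objects -v | awk '/^count:/ {print $2}')
git gc --quiet                    # packs the loose objects into a packfile
after=$(git count-objects -v | awk '/^count:/ {print $2}')
echo "loose objects before gc: $before, after: $after"
```

The "after" count drops because the blobs, trees, and commits now live in .git/objects/pack instead of as individual files.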