All the questions I found are about avoiding timeouts in git push/pull. In my case I want to force them. My pushes and pulls all go over SSH to remote machines that might be unavailable at some point in time. For example, I have a script that pushes to two remote public repos. I don't want this script to hang forever when it pushes to the first repo and that machine is unavailable. Instead, after some timeout I want the push to fail so the script can continue with the second repo.
Any options here?
I don't think you can do an automatic fail-over with built-in features. But since Git just uses SSH underneath, it should work to add a ConnectTimeout option for the machines in question in your .ssh/config (cf. man ssh_config). Then something like git push foo || git push bar in the shell should do what you want.
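As a minimal sketch (the host and remote names below are placeholders for your own setup), the .ssh/config entry could look like this, making SSH give up after 10 seconds instead of hanging:

Host repo1.example.com repo2.example.com
    ConnectTimeout 10

and the script part would then simply try both remotes in sequence:

# push to both remotes; if the first one times out, still try the second
git push repo1 || echo "push to repo1 failed or timed out"
git push repo2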
From https://github.com/git/git/blob/master/Documentation/config.txt (around line 1770 at time of writing):
http.lowSpeedLimit, http.lowSpeedTime:: If the HTTP transfer speed is less than 'http.lowSpeedLimit' for longer than 'http.lowSpeedTime' seconds, the transfer is aborted. Can be overridden by the 'GIT_HTTP_LOW_SPEED_LIMIT' and 'GIT_HTTP_LOW_SPEED_TIME' environment variables.
I call it the Codeplex tweak. Add it to your .gitconfig:

[http]
    lowSpeedLimit = 1000
    lowSpeedTime = 20
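If you prefer not to edit the file by hand, the same settings can be applied with git config; for example, globally:

git config --global http.lowSpeedLimit 1000
git config --global http.lowSpeedTime 20

Note that this only affects HTTP(S) transfers; for SSH remotes, use the ConnectTimeout approach above.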