Python's requests library only supports timeouts on connect and read: http://docs.python-requests.org/en/master/user/advanced/#timeouts
There is no way to force a timeout when the DNS lookup takes a very long time. I would like to trigger a timeout when a GET request takes more than X seconds to complete, including DNS resolution, connect, and read.
Note that I cannot use a signal-based approach, since signals only work in the main thread.
I am looking for an elegant solution.
I don't think it's possible to interrupt the underlying getaddrinfo call in the C standard library by any means other than a signal.
So IMHO you can only work around your problem with multiprocessing, for example with this elegant approach using the timeout_decorator module:
import requests
import timeout_decorator

@timeout_decorator.timeout(5, use_signals=False)
def timed_get(url):
    return requests.get(url)
But remember that this would create a process for each request.
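If you would rather avoid the extra dependency, the same idea can be sketched with only the standard library: run the blocking call in a child process, join with a timeout, and terminate the child if it is still alive. This is a minimal sketch, not the timeout_decorator implementation; `slow_call` is a hypothetical stand-in for `requests.get(url)`, and the "fork" start method is assumed (POSIX-only; on Windows you would use "spawn" plus an `if __name__ == "__main__":` guard).

```python
import multiprocessing
import time

# Use the fork start method so the child inherits parent state directly
# (assumption: POSIX platform; not available on Windows).
_ctx = multiprocessing.get_context("fork")

def slow_call(queue, seconds):
    # Stand-in for the real work, e.g. queue.put(requests.get(url).text)
    time.sleep(seconds)
    queue.put("done")

def run_with_timeout(target, args, timeout):
    """Run target(queue, *args) in a child process, killing it after `timeout` seconds."""
    queue = _ctx.Queue()
    proc = _ctx.Process(target=target, args=(queue, *args))
    proc.start()
    proc.join(timeout)
    if proc.is_alive():
        proc.terminate()  # SIGTERM reaches the child even while it is stuck in getaddrinfo
        proc.join()
        raise TimeoutError("call did not finish within %s seconds" % timeout)
    return queue.get()
```

Unlike a thread, a child process can be killed outright, which is why this bounds the total request time, DNS lookup included.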