
Does requests wait a few seconds before each retry?

Here is my code:

import requests
import time

proxies = {'http': '36.33.1.177:21219'}
url = 'http://218.94.78.61:8080/newPub/service/json/call?serviceName=sysBasicManage&methodName=queryOutputOtherPollutionList&paramsJson=%7B%22ticket%22:%22451a9846-058b-4944-86c6-fccafdb7d8d0%22,%22parameter%22:%7B%22monitorSiteType%22:%2202%22,%22enterpriseCode%22:%22320100000151%22,%22monitoringType%22:%222%22%7D%7D'

i = 0
# Retry failed connections up to 10 times for requests whose URL starts with this prefix
a = requests.adapters.HTTPAdapter(max_retries=10)
s = requests.Session()
s.mount(url, a)

for x in range(1, 1000):
    time.sleep(1)
    print(x)
    try:
        r = s.get(url, proxies=proxies)
        print(r)
    except Exception as ee:
        # Count and report requests that still fail after all retries
        i = i + 1
        print(ee)
        print('i=%s' % i)

The proxy is a little unstable, so I set max_retries, but I still get exceptions sometimes. Is there a way to wait a few seconds before each retry?

Asked by no13bus on Dec 20 '22

1 Answer

With the requests library alone it's not possible. However, you can use an external library such as backoff.

backoff provides a decorator that you wrap around your function. Sample code:

import backoff
import requests

# Retry with a constant 10-second wait between attempts, up to 10 attempts in total
@backoff.on_exception(backoff.constant,
                      requests.exceptions.RequestException,
                      max_tries=10, interval=10)
def get_url(url):
    return requests.get(url)

The above code waits 10 seconds before the next retry whenever a requests.exceptions.RequestException is raised, and it gives up after 10 attempts, as specified by max_tries.
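For completeness, here is a minimal sketch of how the decorated function could be dropped into the loop from the question. The proxies dict, the URL, and the i failure counter are copied from the question; passing proxies through to requests.get inside get_url is an assumption about how you want to keep using the proxy, not something the question requires.

import backoff
import requests
import time

proxies = {'http': '36.33.1.177:21219'}
url = 'http://218.94.78.61:8080/newPub/service/json/call?serviceName=sysBasicManage&methodName=queryOutputOtherPollutionList&paramsJson=%7B%22ticket%22:%22451a9846-058b-4944-86c6-fccafdb7d8d0%22,%22parameter%22:%7B%22monitorSiteType%22:%2202%22,%22enterpriseCode%22:%22320100000151%22,%22monitoringType%22:%222%22%7D%7D'

# Wait a constant 10 seconds between attempts and stop after 10 tries;
# if every attempt fails, the last exception propagates to the caller.
@backoff.on_exception(backoff.constant,
                      requests.exceptions.RequestException,
                      max_tries=10, interval=10)
def get_url(url):
    return requests.get(url, proxies=proxies)

i = 0
for x in range(1, 1000):
    time.sleep(1)
    try:
        r = get_url(url)
        print(r)
    except requests.exceptions.RequestException as ee:
        # All 10 attempts failed; count the failure and move on
        i = i + 1
        print(ee)
        print('i=%s' % i)

If you would rather back off progressively instead of waiting a fixed interval, backoff.expo can be used in place of backoff.constant (without the interval argument).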

Answered by avi on Dec 28 '22