I want to measure the response time of a request made with urllib. I wrote the code below, but the value it prints doesn't match the actual response time. Can I get an accurate time with urllib, or is there some other method?
import urllib.request
import datetime

def main():
    urllist = [
        "http://google.com",
    ]
    for url in urllist:
        try:
            start = datetime.datetime.now()
            f = urllib.request.urlopen(url)
            end = datetime.datetime.now()
            diff = end - start
            # total_seconds() covers the whole interval;
            # diff.microseconds alone would drop any whole seconds
            print(int(round(diff.total_seconds() * 1000)))
        except IOError:
            print('error', url)
        else:
            print(f.getcode(), f.geturl())

if __name__ == "__main__":
                Save yourself some hassle and use the requests module. Its response objects expose a datetime.timedelta attribute called elapsed, which tells you how long the request took (measured from sending the request until the response headers arrive).
>>> import requests
>>> response = requests.get('http://www.google.com')
>>> print(response.elapsed)
0:00:01.762032
>>> response.elapsed
datetime.timedelta(seconds=1, microseconds=762032)
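Since elapsed is an ordinary timedelta, converting it to a plain number is straightforward. A minimal sketch, using a hand-built timedelta in place of a live response so no network access is needed:

```python
import datetime

# Stand-in for response.elapsed; the arithmetic is identical for a real one
elapsed = datetime.timedelta(seconds=1, microseconds=762032)

# total_seconds() folds days, seconds, and microseconds into a single float;
# reading elapsed.microseconds alone would report only the sub-second part
seconds = elapsed.total_seconds()
milliseconds = int(round(seconds * 1000))

print(seconds)       # 1.762032
print(milliseconds)  # 1762
```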
                        I prefer the requests library but was forced to use urllib3. I found that what requests reports via response.elapsed.total_seconds() is roughly equivalent to the following in urllib3:
import datetime
import urllib3

http = urllib3.PoolManager()
url_string = "http://google.com"

start = datetime.datetime.now()
response = http.request('GET', url_string)
end = datetime.datetime.now()

delta = end - start
# total_seconds() includes whole seconds; delta.microseconds alone would not
elapsed_seconds = round(delta.total_seconds(), 6)
print(elapsed_seconds)
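For wall-clock timing in general, time.perf_counter() is a steadier yardstick than subtracting datetime.now() values, since it is monotonic and unaffected by system clock adjustments. A minimal sketch, timing a stand-in function instead of a real HTTP call so it runs without network access:

```python
import time

def do_work():
    # Stand-in for an HTTP request such as http.request('GET', url_string)
    time.sleep(0.05)

start = time.perf_counter()
do_work()
elapsed_seconds = time.perf_counter() - start

# Elapsed time in seconds as a float, e.g. 0.050123
print(round(elapsed_seconds, 6))
```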