 

Response time for urllib in python

Tags:

python

urllib

I want to measure the response time when I use urllib. I wrote the code below, but the time it reports is longer than the actual response time. Can I get the response time with urllib, or is there another method?

import urllib
import datetime

def main():
    urllist = [
        "http://google.com",
    ]

    for url in urllist:
        opener = urllib.FancyURLopener({})
        try:
            start = datetime.datetime.now()
            f = opener.open(url)
            end = datetime.datetime.now()
            diff = end - start
            print int(round(diff.microseconds / 1000))
        except IOError, e:
            print 'error', url
        else:
            print f.getcode(), f.geturl()

if __name__ == "__main__":
    main()
asked Jun 05 '13 by Edward


2 Answers

Save yourself some hassle and use the requests module. Its response objects carry a datetime.timedelta attribute called elapsed that tells you how long the request took (measured from sending the request until the response headers were parsed).

>>> import requests
>>> response = requests.get('http://www.google.com')
>>> print response.elapsed
0:00:01.762032
>>> response.elapsed
datetime.timedelta(0, 1, 762032)
answered Nov 16 '22 by synthesizerpatel


I prefer the requests library but was forced to use urllib3. urllib3 has no built-in equivalent of requests' response.elapsed.total_seconds(), so I timed the request by hand, which is roughly equivalent:

import datetime
import urllib3

http = urllib3.PoolManager()

url_string = "http://google.com"
start = datetime.datetime.now()
response = http.request('GET', url_string)
end = datetime.datetime.now()
delta = end - start

# total_seconds() includes whole seconds plus the fractional part;
# delta.microseconds alone would drop anything beyond one second
elapsed_seconds = round(delta.total_seconds(), 6)
print(elapsed_seconds)
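If plain wall-clock timing is all that is needed, a sketch of the same idea using time.monotonic() (assuming Python 3; a monotonic clock is not affected by system clock adjustments, unlike datetime.now()) could be:

import time
import urllib3

http = urllib3.PoolManager()

url_string = "http://google.com"
start = time.monotonic()
response = http.request('GET', url_string)
elapsed_seconds = time.monotonic() - start

# status is the HTTP status code of the urllib3 response
print(response.status, round(elapsed_seconds, 6))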
answered Nov 16 '22 by its30