 

python urllib2 timing

I'd like to collect statistics on how long each phase of a web request takes. With httplib I can time the phases myself:

import time
import httplib

# run() is a method of a test class that provides self.custom_timers (a dict of timings)
def run(self):
    conn = httplib.HTTPConnection('www.example.com')
    start = time.time()
    conn.request('GET', '/')          # send the request
    request_time = time.time()
    resp = conn.getresponse()         # wait for the status line and headers
    response_time = time.time()
    resp.read()                       # transfer the response body
    conn.close()
    transfer_time = time.time()

    self.custom_timers['request sent'] = request_time - start
    self.custom_timers['response received'] = response_time - start
    self.custom_timers['content transferred'] = transfer_time - start

    assert (resp.status == 200), 'Bad Response: HTTP %s' % resp.status

Are these statistics available from a higher-level interface like urllib2? Is there a high-level library that offers such statistics?

asked Aug 20 '12 by user1367401


1 Answer

As mentioned in a related question, a good way to do this now is to use the requests library. You can use it to measure the request latency, though I'm not sure whether it exposes the content-transfer time directly. You could approximate that by comparing a HEAD request to a GET request.
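
For example, here is a minimal sketch (using only the public requests API; the example.com URL mirrors the question, and the header/body split relies on streaming rather than on anything requests reports itself):

import time
import requests

start = time.time()
# with stream=True the call returns once the response headers have arrived;
# the body is not downloaded until .content (or iter_content) is accessed
resp = requests.get('http://www.example.com/', stream=True)
response_time = time.time()
body = resp.content               # force the body transfer
transfer_time = time.time()

print 'response received: %.3fs' % (response_time - start)
print 'content transferred: %.3fs' % (transfer_time - start)
print 'request latency per requests: %.3fs' % resp.elapsed.total_seconds()

Note that resp.elapsed only covers the time from sending the request until the response headers are parsed, which is why the streaming trick above is used to approximate the original 'content transferred' timer.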

answered Oct 14 '22 by Pierz