I'd like to collect statistics on how long each phase of a web request takes. With httplib I can do:
    import time
    import httplib

    def run(self):
        conn = httplib.HTTPConnection('www.example.com')
        start = time.time()
        conn.request('GET', '/')
        request_time = time.time()
        resp = conn.getresponse()   # returns once the response headers are in
        response_time = time.time()
        resp.read()                 # the body is not actually transferred until it is read
        conn.close()
        transfer_time = time.time()
        self.custom_timers['request sent'] = request_time - start
        self.custom_timers['response received'] = response_time - start
        self.custom_timers['content transferred'] = transfer_time - start
        assert resp.status == 200, 'Bad Response: HTTP %s' % resp.status
Are these statistics available from a higher-level interface like urllib2? Is there a high-level library that offers such statistics?
urllib2 is a Python module that can be used for fetching URLs.
NOTE: urllib2 no longer exists in Python 3; its functionality was split across urllib.request and urllib.error.
1) urllib2 can accept a Request object to set the headers for a URL request; urllib accepts only a URL.

2) urllib provides the urlencode method, which is used to generate GET query strings; urllib2 has no such function. This is one reason urllib is often used alongside urllib2, as in the sketch below.
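A minimal Python 2 sketch of that combination (the URL, query parameters, and header value here are placeholders, not anything from the question):

    # Python 2 only: urllib supplies urlencode, urllib2 supplies Request/urlopen
    import urllib
    import urllib2

    # urllib.urlencode builds the GET query string
    params = urllib.urlencode({'q': 'timing', 'page': 1})

    # urllib2.Request lets us attach custom headers
    req = urllib2.Request('http://www.example.com/search?' + params,
                          headers={'User-Agent': 'timing-example'})
    resp = urllib2.urlopen(req)
    print(resp.getcode(), len(resp.read()))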
As mentioned in a related question, a good way to do this now is to use the requests library. You can use it to measure request latency, though I'm not sure you can measure the content transfer timing directly. One option is to compare a HEAD request to a GET request.
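Another option is a rough sketch like the following (the phase names mirror the httplib timers above; this is an approximation, not socket-level timing). With stream=True, requests returns as soon as the headers are parsed, so reading .content afterwards approximates the content-transfer phase:

    import time
    import requests

    def timed_get(url):
        start = time.time()
        # stream=True: the call returns once the response headers are parsed
        r = requests.get(url, stream=True)
        response_time = time.time() - start
        # consuming the body measures the content-transfer phase
        body = r.content
        transfer_time = time.time() - start
        r.raise_for_status()
        return {
            'response received': response_time,   # also available as r.elapsed (a timedelta)
            'content transferred': transfer_time,
        }

    print(timed_get('http://www.example.com/'))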