I'm trying to build a tool for testing the delay of my internet connection, more specifically website load times. I thought of using the Python requests module for the loading part.
The problem is, it has no built-in functionality to measure the time it took to get the full response, so for that I thought I would use the timeit module.
What I'm not sure about is this: if I run timeit like so,
t = timeit.Timer("requests.get('http://www.google.com')", "import requests")
am I really measuring the time it took the response to arrive, or the time it takes for the request to be built, sent, received, etc.? I'm guessing I can probably disregard that execution time, since I'm testing networks with very long delays (~700 ms).
Is there a better way to do this programmatically?
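For reference, running a single timed call would look something like this (a sketch; google.com is just an example target):

import timeit

t = timeit.Timer("requests.get('http://www.google.com')", "import requests")
# number=1 runs the statement once and returns the wall-clock seconds it took
print(t.timeit(number=1))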
There is such functionality in the latest version of requests:
https://requests.readthedocs.io/en/latest/api/?highlight=elapsed#requests.Response.elapsed
For example:
requests.get("http://127.0.0.1").elapsed.total_seconds()
As for your question, it should be the total time for the request to be built, sent, and the response received.
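One caveat: elapsed only covers the time until the response headers are parsed, not the body download. If you want the full time including the body, you can time it yourself; a minimal sketch (using stream=True so the body download is deferred and can be measured, with google.com as an example target):

import time
import requests

start = time.perf_counter()
# stream=True makes get() return once the headers arrive, deferring the body
response = requests.get('http://www.google.com', stream=True)
_ = response.content  # accessing .content downloads the full body
total = time.perf_counter() - start

print(f"headers: {response.elapsed.total_seconds():.3f}s  full response: {total:.3f}s")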
Another way to measure a single request's load time is to use urllib (urllib.request in Python 3):

import time
import urllib.request

url = 'http://www.google.com'  # example target
start = time.time()
nf = urllib.request.urlopen(url)
page = nf.read()  # read the full response body
end = time.time()
nf.close()
# end - start gives you the page load time, including connection setup
response.elapsed
returns a timedelta object with the time elapsed between sending the request and the arrival of the response, measured up to the point the response headers are parsed (it does not include downloading the body). If you want to actually cut off a request after a certain amount of time has elapsed, use the timeout parameter instead.
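For example, a sketch combining the two (the timeout values are arbitrary):

import requests

try:
    # timeout=(connect, read): abort if connecting takes over 3 s,
    # or if the server stalls for over 5 s between bytes
    response = requests.get('http://stackoverflow.com/', timeout=(3.0, 5.0))
    print(response.elapsed.total_seconds())
except requests.exceptions.Timeout:
    print('request timed out')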
# import requests module
import requests
# Making a get request
response = requests.get('http://stackoverflow.com/')
# print response
print(response)
# print elapsed time
print(response.elapsed)
output:
<Response [200]>
0:00:00.343720