Is it possible to limit the download rate of GET requests using the requests
Python library? For instance, with a command like this:
r = requests.get('https://stackoverflow.com/')
...is it possible to limit the download rate? I'm hoping for something similar to this wget command:
wget --limit-rate=20k https://stackoverflow.com/
I know it's possible with urllib2. I'm asking specifically about the requests library.
Check out the awesome ratelimit library. It's perfect if you just want to rate-limit your calls to a REST API for whatever reason and get on with your life. It will block the thread if more than one request per minute is issued.
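For reference, ratelimit works through decorators. A minimal sketch of the one-call-per-minute setup described above (sleep_and_retry is what makes it block instead of raising):

import requests
from ratelimit import limits, sleep_and_retry

# At most 1 call per 60-second window; sleep_and_retry blocks the
# thread until the window resets instead of raising RateLimitException.
@sleep_and_retry
@limits(calls=1, period=60)
def fetch(url):
    return requests.get(url)

r = fetch('https://stackoverflow.com/')

Note that this caps the number of calls, not bytes per second, so it is request throttling rather than wget-style download-rate limiting.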
There are several approaches to rate limiting; one of them is the token bucket, for which you can find a recipe here and another one here.
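A minimal sketch of the token bucket idea (the class name and parameters are my own, not taken from either recipe): tokens refill at a fixed rate, each unit of work consumes tokens, and callers block when the bucket runs dry.

import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity   # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def consume(self, n):
        # Refill based on elapsed time, then block until n tokens are available.
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= n:
                self.tokens -= n
                return
            time.sleep((n - self.tokens) / self.rate)

With rate and capacity measured in bytes, calling consume(len(chunk)) before handling each downloaded chunk caps the transfer rate.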
Usually you would want to do throttling or rate limiting on socket.send() and socket.recv(). You could play with socket-throttle and see if it does what you need.
This is not to be confused with the x-ratelimit rate-limiting response headers, which relate to the number of requests made rather than a download/transfer rate.
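If patching sockets is more than you need, you can also stay entirely within requests by streaming the response and pacing how fast you consume it. Unread data waits in the TCP receive buffer, so pausing between chunks lets TCP flow control slow the sender down, which is roughly what wget --limit-rate does. A sketch, with an illustrative 20 KB/s default to mirror the wget example (the function and parameter names are my own):

import time
import requests

def download(url, path, limit_bps=20 * 1024, chunk_size=8192):
    # Stream url to path, capping throughput at roughly limit_bps bytes/sec.
    start = time.monotonic()
    received = 0
    with requests.get(url, stream=True) as r:
        r.raise_for_status()
        with open(path, 'wb') as f:
            for chunk in r.iter_content(chunk_size=chunk_size):
                f.write(chunk)
                received += len(chunk)
                # If we are ahead of the target rate, sleep off the difference.
                expected = received / limit_bps
                elapsed = time.monotonic() - start
                if expected > elapsed:
                    time.sleep(expected - elapsed)

download('https://stackoverflow.com/', 'index.html')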