How can I set a maximum number of requests per second (i.e. rate-limit them) on the client side using aiohttp?
The difference is that requests fetches the whole body of the response at once and remembers it, but aiohttp doesn't. aiohttp lets you ignore the body, read it in chunks, or read it after looking at the headers/status code. That's why you need a second await: aiohttp needs to do more I/O to get the response body.
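A minimal sketch of that two-step read. It spins up a throwaway local aiohttp.web server (an assumption for illustration, so the example runs without external network access) and shows that the status is available after the first await, while the body needs a second one:

```python
import asyncio

import aiohttp
from aiohttp import web


async def handler(request):
    # Tiny illustrative endpoint for the local test server.
    return web.Response(text="hello")


async def main():
    # Throwaway local server so the client code below has something to hit.
    app = web.Application()
    app.router.add_get("/", handler)
    runner = web.AppRunner(app)
    await runner.setup()
    site = web.TCPSite(runner, "127.0.0.1", 0)  # port 0: pick a free port
    await site.start()
    port = runner.addresses[0][1]

    async with aiohttp.ClientSession() as session:
        # First await: request is sent, status and headers are available.
        async with session.get(f"http://127.0.0.1:{port}/") as resp:
            status = resp.status        # no extra I/O needed for this
            body = await resp.text()    # second await: actually read the body

    await runner.cleanup()
    return status, body


status, body = asyncio.run(main())
print(status, body)
```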
By default the aiohttp.ClientSession object will hold a connector with a maximum of 100 connections, putting the rest in a queue. This is quite a big number: it means you would have to be connected to a hundred different servers (not pages!) concurrently before even having to consider whether your task needs resource adjustment.
If you need to add HTTP headers to a request, pass them in a dict to the headers parameter:

await session.post(url, data='Hello, World!',
                   headers={'Content-Type': 'text/plain'})
Although it's not exactly a limit on the number of requests per second, note that since v2.0, when using a ClientSession, aiohttp automatically limits the number of simultaneous connections to 100. You can modify the limit by creating your own TCPConnector and passing it into the ClientSession. For instance, to create a client limited to 50 simultaneous requests:
import aiohttp

connector = aiohttp.TCPConnector(limit=50)
client = aiohttp.ClientSession(connector=connector)
In case it's better suited to your use case, there is also a limit_per_host parameter (which is off by default) that you can pass to limit the number of simultaneous connections to the same "endpoint". Per the docs:

limit_per_host (int) – limit for simultaneous connections to the same endpoint. Endpoints are the same if they have an equal (host, port, is_ssl) triple.
Example usage:
import aiohttp

connector = aiohttp.TCPConnector(limit_per_host=50)
client = aiohttp.ClientSession(connector=connector)
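The connector limits above cap simultaneous connections, not requests per second, which is what the question actually asks for. One common approach is to space out request starts yourself. The sketch below uses a small hypothetical RateLimiter helper (not part of aiohttp) built on asyncio; the fetch coroutine stands in for wherever you would call session.get:

```python
import asyncio
import time


class RateLimiter:
    """Allow at most `rate` acquisitions per second.

    Hypothetical helper for illustration; not an aiohttp API.
    """

    def __init__(self, rate: float):
        self.interval = 1.0 / rate
        self._lock = asyncio.Lock()
        self._next_ok = 0.0  # monotonic time when the next slot opens

    async def acquire(self):
        async with self._lock:
            now = time.monotonic()
            wait = self._next_ok - now
            # Reserve the next slot, spaced `interval` seconds apart.
            self._next_ok = max(now, self._next_ok) + self.interval
        if wait > 0:
            await asyncio.sleep(wait)


async def fetch(limiter, i):
    await limiter.acquire()
    # Here you would do the real request, e.g.:
    #     async with session.get(url) as resp:
    #         return await resp.text()
    return i


async def main():
    limiter = RateLimiter(rate=10)  # at most 10 requests per second
    start = time.monotonic()
    results = await asyncio.gather(*(fetch(limiter, i) for i in range(20)))
    elapsed = time.monotonic() - start
    return results, elapsed


results, elapsed = asyncio.run(main())
print(len(results), round(elapsed, 1))
```

With 20 requests at 10 per second, the run takes roughly two seconds: the limiter lets the first request start immediately and then spaces the rest 0.1 s apart, regardless of how many are awaited concurrently.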