I have some code that makes a sequence of requests to an API. I'd like to set up common logging for all of them — how can I do that?
Let's say my code looks like this:
import aiohttp
import asyncio

async def fetch(client):
    async with client.get('http://httpbin.org/get') as resp:
        assert resp.status == 200
        return await resp.text()

async def post_data(client):
    async with client.post('http://httpbin.org/post', data={'foo': 'bar'}) as resp:
        assert resp.status == 200
        return await resp.text()

async def main(loop):
    async with aiohttp.ClientSession(loop=loop) as client:
        html = await fetch(client)
        print(html)
        other_html = await post_data(client)
        print(other_html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))
Now I'd like to see the status code, URL, headers and so on for every request made, so the log output would look like this:
2017-08-09 08:44:30 DEBUG (200) <GET http://httpbin.org/get>
2017-08-09 08:44:30 DEBUG (200) <POST http://httpbin.org/post>
I know I could add a logger.log() call after every request, but that would be duplication: with more requests I'd have to repeat the same logging code after each call. That seems inefficient.
There is an aiohttp.client logger, but there are no details on how to set it up.
I'm trying to set it up like this:
import logging

logger = logging.getLogger('simple_example')
logger.setLevel(logging.DEBUG)

ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
logger.addHandler(ch)

logging.getLogger('aiohttp.client').addHandler(ch)
but it doesn't print the information I'd like to see (e.g. the response status code and URL).
Is there some way to achieve what I need? Perhaps I can subscribe to some signal from the client and log a message when that signal is sent — e.g. a mechanism that fires when the client receives a response, so I can log a message there?
At the time of writing (11/2020) you should use the tracing functionality of aiohttp:
import asyncio
import logging

import aiohttp

async def on_request_start(session, context, params):
    logging.getLogger('aiohttp.client').debug(f'Starting request <{params}>')

async def main():
    logging.basicConfig(level=logging.DEBUG)
    trace_config = aiohttp.TraceConfig()
    trace_config.on_request_start.append(on_request_start)
    async with aiohttp.ClientSession(trace_configs=[trace_config]) as session:
        await session.get('https://google.com')

asyncio.run(main())
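If you want the status code and URL in the log line (as in the desired output in the question), the on_request_end signal is better suited: its params object carries the method, URL and response. A minimal sketch building on the tracing example above — the logger name and message format are just illustrative choices:

import asyncio
import logging

import aiohttp

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(levelname)s %(message)s',
                    datefmt='%Y-%m-%d %H:%M:%S')
logger = logging.getLogger('aiohttp.client')

async def on_request_end(session, context, params):
    # params (TraceRequestEndParams) exposes method, url, headers and response
    logger.debug('(%s) <%s %s>', params.response.status, params.method, params.url)

async def main():
    trace_config = aiohttp.TraceConfig()
    trace_config.on_request_end.append(on_request_end)
    async with aiohttp.ClientSession(trace_configs=[trace_config]) as session:
        await session.get('http://httpbin.org/get')
        await session.post('http://httpbin.org/post', data={'foo': 'bar'})

asyncio.run(main())

This should produce lines of the form 2017-08-09 08:44:30 DEBUG (200) <GET http://httpbin.org/get> without any logging code next to the individual requests.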
As you can see in aiohttp's code, the aiohttp.client logger is not used to log requests; it only logs a warning if the cookies in a response are invalid: https://github.com/aio-libs/aiohttp/search?utf8=%E2%9C%93&q=client_logger&type=
To log every request you make, you will need to create a custom ClientSession that does what you want. Something like:
class LoggingClientSession(aiohttp.ClientSession):
    def request(self, method, url, **kwargs):
        # 'logger' is a module-level logging.Logger configured elsewhere
        logger.debug('Starting request <%s %r>', method, url)
        return super().request(method, url, **kwargs)
As noted by Jaanus in the comments here, the post, get, … helpers now call ClientSession._request directly, instead of request. So overriding the latter won't intercept calls made with the short-hand helpers.
So you can either:
- override _request instead of request in your helper, or
- make sure your code never uses the get/… helpers and always calls request directly, or
- also define all the helper methods in your LoggingClientSession.
And as noted by Romuald, _request is now a coroutine, so overriding it with a regular function will not log at exactly the right time. Here is an updated example:
class LoggingClientSession(aiohttp.ClientSession):
    async def _request(self, method, url, **kwargs):
        logger.debug('Starting request <%s %r>', method, url)
        return await super()._request(method, url, **kwargs)
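For completeness, a self-contained sketch of how this subclass might be wired up; the logger name request_logger and the handler/format setup are arbitrary choices, not anything mandated by aiohttp:

import asyncio
import logging

import aiohttp

# Arbitrary module-level logger used by the subclass below.
logger = logging.getLogger('request_logger')
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')

class LoggingClientSession(aiohttp.ClientSession):
    async def _request(self, method, url, **kwargs):
        logger.debug('Starting request <%s %r>', method, url)
        return await super()._request(method, url, **kwargs)

async def main():
    async with LoggingClientSession() as client:
        async with client.get('http://httpbin.org/get') as resp:
            assert resp.status == 200
            await resp.text()

asyncio.run(main())

Note that newer aiohttp versions may warn that subclassing ClientSession is discouraged, so the tracing approach above is generally the safer option.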