I am using an API that sometimes returns odd status codes which could be fixed by simply retrying the same request. I am using aiohttp to submit requests to this API asynchronously.
I am also using the backoff library to retry requests; however, requests still do not appear to be retried upon 401 status responses.
@backoff.on_exception(backoff.expo, aiohttp.ClientError, max_tries=11, max_time=60)
async def get_user_timeline(self, session, user_id, count, max_id, trim_user, include_rts, tweet_mode):
    params = {
        'user_id': user_id,
        'trim_user': trim_user,
        'include_rts': include_rts,
        'tweet_mode': tweet_mode,
        'count': count
    }
    if max_id and max_id != -1:
        params.update({'max_id': max_id})
    headers = {
        'Authorization': 'Bearer {}'.format(self.access_token)
    }
    users_lookup_url = "/1.1/statuses/user_timeline.json"
    url = self.base_url + users_lookup_url
    async with session.get(url, params=params, headers=headers) as response:
        result = await response.json()
        return {
            'result': result,
            'status': response.status,
            'headers': response.headers
        }
I would like all requests to be retried up to 10 times if the response has a status code other than 200 or 429.
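To make the desired policy concrete, here is a library-free sketch of it: retry with exponential backoff on every status except 200 and 429, up to 10 attempts. `fetch_with_retry` and `fake_fetch` are hypothetical names, and the coroutine is assumed to return a dict with a 'status' key, as in the question's code:

```python
import asyncio
import random

# retry on everything except 200 and 429
RETRYABLE = set(range(100, 600)) - {200, 429}

async def fetch_with_retry(fetch, max_tries=10, base_delay=0.1):
    """Call the `fetch` coroutine until it returns a non-retryable status.

    `fetch` is a hypothetical stand-in for the aiohttp request; it should
    return a dict with at least a 'status' key.
    """
    for attempt in range(max_tries):
        response = await fetch()
        if response['status'] not in RETRYABLE:
            return response
        # exponential backoff with a little jitter before the next attempt
        await asyncio.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.05))
    return response  # last response after exhausting all retries

# Usage: simulate a server that fails twice with 500, then succeeds.
statuses = iter([500, 500, 200])

async def fake_fetch():
    return {'status': next(statuses), 'result': None}

result = asyncio.run(fetch_with_retry(fake_fetch, base_delay=0.01))
print(result['status'])  # 200
```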
I made a simple library that can help you:
https://github.com/inyutin/aiohttp_retry
Code like this should solve your problem:
from aiohttp import ClientSession
from aiohttp_retry import RetryClient

statuses = {x for x in range(100, 600)}
statuses.remove(200)
statuses.remove(429)

async with ClientSession() as client:
    retry_client = RetryClient(client)
    async with retry_client.get("https://google.com", retry_attempts=10, retry_for_statuses=statuses) as response:
        text = await response.text()
        print(text)
    await retry_client.close()
Instead of google.com, use your own URL.
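As a side note, the three-line status-set construction above can be written as a single set difference, which makes the intent (everything except 200 and 429) easier to see:

```python
# equivalent to building the full 100-599 range and removing 200 and 429
statuses = set(range(100, 600)) - {200, 429}

print(len(statuses))  # 498
```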
By default aiohttp doesn't raise an exception for non-200 statuses. You can change that by passing raise_for_status=True (doc):
async with session.get(url, params=params, headers=headers, raise_for_status=True) as response:
It should raise an exception for any status of 400 or higher and thus trigger backoff.
Codes 2xx probably shouldn't be retried, since these aren't errors.
Anyway, if you still want to raise for "other than 200 or 429" you can do it manually (note that ClientResponseError requires the request info and history as arguments):
if response.status not in (200, 429):
    raise aiohttp.ClientResponseError(
        response.request_info, response.history, status=response.status
    )
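To see how this manual raise would drive the retries, here is a library-free sketch of the exception-based flow that backoff.on_exception implements, including its giveup= hook for statuses you don't want retried. `StatusError`, `retry_on_exception`, and `fake_request` are hypothetical stand-ins, not backoff or aiohttp APIs:

```python
import asyncio

class StatusError(Exception):
    """Stand-in for aiohttp.ClientResponseError, carrying the status code."""
    def __init__(self, status):
        super().__init__(f"bad status: {status}")
        self.status = status

async def retry_on_exception(coro_fn, max_tries=10, giveup=lambda e: False):
    """Re-run `coro_fn` when it raises StatusError, like backoff.on_exception.

    `giveup` mirrors backoff's giveup= parameter: return True to stop retrying.
    """
    for attempt in range(max_tries):
        try:
            return await coro_fn()
        except StatusError as e:
            if giveup(e) or attempt == max_tries - 1:
                raise

# Usage: raise for anything other than 200, but give up immediately on 429.
statuses = iter([500, 503, 200])

async def fake_request():
    status = next(statuses)
    if status != 200:
        raise StatusError(status)
    return status

result = asyncio.run(
    retry_on_exception(fake_request, giveup=lambda e: e.status == 429)
)
print(result)  # 200
```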
Mikhail's answer covers how to raise exceptions for 4XX and 5XX status codes, but if you want your coroutine to be retried for these status codes, take a look at the async_retrying library - https://pypi.org/project/async_retrying/
A simple example is given below.
import asyncio

from aiohttp import ClientSession
from async_retrying import retry

@retry(attempts=2)
async def hit_url(url, session: ClientSession):
    async with session.get(url) as response:
        print("Calling URL: %s" % url)
        await response.text()
        return response.status

async def main():
    urls = [
        "https://google.com",
        "https://yahoo.com"
    ]
    api_calls = []
    async with ClientSession(raise_for_status=True) as session:
        for url in urls:
            api_calls.append(hit_url(session=session, url=url))
        await asyncio.gather(*api_calls, return_exceptions=False)

asyncio.run(main())