I am trying to learn to use asyncio in Python to optimize scripts. My example raises a "coroutine was never awaited"
warning; can you help me understand it and find how to solve it?
import time
import datetime
import random
import asyncio
import aiohttp
import requests

def requete_bloquante(num):
    print(f'Get {num}')
    uid = requests.get("https://httpbin.org/uuid").json()['uuid']
    print(f"Res {num}: {uid}")

def faire_toutes_les_requetes():
    for x in range(10):
        requete_bloquante(x)

print("Bloquant : ")
start = datetime.datetime.now()
faire_toutes_les_requetes()
exec_time = (datetime.datetime.now() - start).seconds
print(f"Pour faire 10 requêtes, ça prend {exec_time}s\n")

async def requete_sans_bloquer(num, session):
    print(f'Get {num}')
    async with session.get("https://httpbin.org/uuid") as response:
        uid = (await response.json()['uuid'])
        print(f"Res {num}: {uid}")

async def faire_toutes_les_requetes_sans_bloquer():
    loop = asyncio.get_event_loop()
    with aiohttp.ClientSession() as session:
        futures = [requete_sans_bloquer(x, session) for x in range(10)]
        loop.run_until_complete(asyncio.gather(*futures))
    loop.close()
    print("Fin de la boucle !")

print("Non bloquant : ")
start = datetime.datetime.now()
faire_toutes_les_requetes_sans_bloquer()
exec_time = (datetime.datetime.now() - start).seconds
print(f"Pour faire 10 requêtes, ça prend {exec_time}s\n")
The first classic part of the code runs correctly, but the second half only produces:
synchronicite.py:43: RuntimeWarning: coroutine 'faire_toutes_les_requetes_sans_bloquer' was never awaited
asyncio is a programming model that achieves concurrency without multi-threading. It is a single-threaded, single-process design: it uses cooperative multitasking, i.e., it gives a sense of concurrency despite using a single thread in a single process.
Coroutines are generalizations of subroutines. They are used for cooperative multitasking, where a task voluntarily yields control periodically, or when idle, so that multiple tasks can appear to run simultaneously.
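For example, here is a minimal sketch of that cooperative hand-off; the worker names and delays are made up purely for illustration:

import asyncio

async def worker(name, delay):
    for i in range(3):
        print(f"{name}: step {i}")
        await asyncio.sleep(delay)  # voluntarily yield control back to the event loop

async def main():
    # Both workers run on a single thread; their output is interleaved because
    # each "await" gives the other one a chance to run.
    await asyncio.gather(worker("A", 0.1), worker("B", 0.1))

asyncio.run(main())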
coroutine" decorator is deprecated since Python 3.8, use "async def" instead · Javascript Required. Kindly enable Javascript. Updates · Content Removed.
You made faire_toutes_les_requetes_sans_bloquer an awaitable function, a coroutine, by using async def.
When you call an awaitable function, you create a new coroutine object. The code inside the function won't run until you then await on the function or run it as a task:
>>> async def foo():
...     print("Running the foo coroutine")
...
>>> foo()
<coroutine object foo at 0x10b186348>
>>> import asyncio
>>> asyncio.run(foo())
Running the foo coroutine
You want to keep that function synchronous, because you don't start the loop until inside that function:
def faire_toutes_les_requetes_sans_bloquer():
    loop = asyncio.get_event_loop()
    # ...
    loop.close()
    print("Fin de la boucle !")
However, you are also trying to use an aiohttp.ClientSession() object, and that's an asynchronous context manager: you are expected to use it with async with, not just with, and so it has to be run inside an awaitable task. If you use with instead of async with, a TypeError("Use async with instead") exception is raised.
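To show what async with expects from an object, here is a tiny hand-rolled async context manager; FakeSession is made up for this example and is not part of aiohttp:

import asyncio

class FakeSession:
    # __aenter__ and __aexit__ are coroutines, so only "async with" inside an
    # async function can drive them; a plain "with" has no way to await them.
    async def __aenter__(self):
        print("opening the session")
        return self

    async def __aexit__(self, exc_type, exc, tb):
        print("closing the session")

async def main():
    async with FakeSession() as session:
        print("doing requests with", session)

asyncio.run(main())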
That all means you need to move the loop.run_until_complete() call out of your faire_toutes_les_requetes_sans_bloquer() function, so you can keep that as the main task to be run; you can then call and await on asyncio.gather() directly:
async def faire_toutes_les_requetes_sans_bloquer():
    async with aiohttp.ClientSession() as session:
        futures = [requete_sans_bloquer(x, session) for x in range(10)]
        await asyncio.gather(*futures)
    print("Fin de la boucle !")

print("Non bloquant : ")
start = datetime.datetime.now()
asyncio.run(faire_toutes_les_requetes_sans_bloquer())
exec_time = (datetime.datetime.now() - start).seconds
print(f"Pour faire 10 requêtes, ça prend {exec_time}s\n")
I used the new asyncio.run() function (Python 3.7 and up) to run the single main task. This creates a dedicated loop for that top-level coroutine and runs it until complete.
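On Python versions before 3.7, a rough hand-written equivalent of that asyncio.run() call would be the following; this is a simplified sketch, not the exact implementation of asyncio.run():

# create a fresh loop, run the top-level coroutine to completion, then close the loop
loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(faire_toutes_les_requetes_sans_bloquer())
finally:
    loop.close()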
Next, you need to move the closing ) parenthesis on the await response.json() expression:
uid = (await response.json())['uuid']
You want to access the 'uuid' key on the result of the await, not on the coroutine that response.json() produces.
With those changes your code works, but the asyncio version finishes in sub-second time; you may want to print microseconds:
exec_time = (datetime.datetime.now() - start).total_seconds()
print(f"Pour faire 10 requêtes, ça prend {exec_time:.3f}s\n")
On my machine, the synchronous requests code completes in about 4-5 seconds, and the asyncio code completes in under 0.5 seconds.
Do not use a loop.run_until_complete call inside an async function. The purpose of that method is to run an async function in a sync context. Anyway, here's how you should change the code:
async def faire_toutes_les_requetes_sans_bloquer():
    async with aiohttp.ClientSession() as session:
        futures = [requete_sans_bloquer(x, session) for x in range(10)]
        await asyncio.gather(*futures)
    print("Fin de la boucle !")

loop = asyncio.get_event_loop()
loop.run_until_complete(faire_toutes_les_requetes_sans_bloquer())
Note that a bare faire_toutes_les_requetes_sans_bloquer() call creates a coroutine that has to be either awaited via an explicit await (for that you have to be inside an async context) or passed to some event loop. Left alone, Python complains about that. In your original code you do neither.
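For completeness, here is a minimal sketch of those three situations; the coroutine name tache is made up for illustration:

import asyncio

async def tache():
    print("running tache")

# tache()  # creating the coroutine without awaiting it triggers
#          # "RuntimeWarning: coroutine 'tache' was never awaited"

async def principale():
    await tache()  # option 1: await it from inside another coroutine

asyncio.run(principale())  # option 2: hand the top-level coroutine to an event loop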