
How to make request without blocking (using asyncio)?

I would like to achieve the following using asyncio:

# Each iteration of this loop MUST last only 1 second
while True:
    # Make an async request

    sleep(1)

However, the only examples I've seen use some variation of

async def my_func():
    loop = asyncio.get_event_loop()
    await loop.run_in_executor(None, requests.get, 'http://www.google.com')

loop = asyncio.get_event_loop()
loop.run_until_complete(my_func())

But run_until_complete is blocking! Using run_until_complete in each iteration of my while loop would cause the loop to block.

I've spent the last couple of hours trying to figure out how to correctly run a non-blocking task (defined with async def) without success. I must be missing something obvious, because something this basic should surely be simple. How can I achieve what I have described?

Dotl asked Oct 11 '17 16:10

1 Answer

run_until_complete runs the main event loop. It isn't "blocking" in the sense you mean: it simply runs the event loop until the coroutine you passed it returns. It has to hang, because otherwise the program would either exit or be blocked by the next instructions.
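To see the distinction concretely, here is a minimal, self-contained sketch (no HTTP involved; the coroutine just sleeps) showing that run_until_complete simply drives the loop until its argument finishes:

```python
import asyncio

async def work():
    # Awaiting hands control back to the event loop, which can run
    # other scheduled tasks in the meantime.
    await asyncio.sleep(0.1)
    return 'done'

loop = asyncio.get_event_loop()
# Runs the event loop until work() returns, then hands back its result.
result = loop.run_until_complete(work())
print(result)
```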

It's pretty hard to tell what you are trying to achieve, but this piece of code actually does something:

import asyncio
import requests

async def my_func():
    loop = asyncio.get_event_loop()
    while True:
        # Run the blocking requests call in a worker thread so the
        # event loop stays free while the request is in flight
        res = await loop.run_in_executor(None, requests.get, 'http://www.google.com')
        print(res)
        await asyncio.sleep(1)

loop = asyncio.get_event_loop()
loop.run_until_complete(my_func())

It will perform a GET request to the Google homepage every second, spawning a new thread to perform each request. You can convince yourself that it's actually non-blocking by running multiple requests virtually in parallel:

import asyncio
import requests

async def entrypoint():
    await asyncio.wait([
        get('https://www.google.com'),
        get('https://www.stackoverflow.com'),
    ])

async def get(url):
    loop = asyncio.get_event_loop()
    while True:
        res = await loop.run_in_executor(None, requests.get, url)
        print(url, res)
        await asyncio.sleep(1)

loop = asyncio.get_event_loop()
loop.run_until_complete(entrypoint())

Another thing to notice is that you're running each request in a separate thread. That works, but it's something of a hack; you should instead use a real asynchronous HTTP client such as aiohttp.
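aiohttp itself isn't shown here (it's a third-party package), but the thread-free model it relies on can be sketched with stdlib asyncio alone — fake_fetch below is a hypothetical stand-in for a real HTTP call:

```python
import asyncio
import time

async def fake_fetch(url, delay):
    # Stand-in for an aiohttp request: awaiting a real asyncio primitive
    # yields to the event loop, so other coroutines run during the wait.
    await asyncio.sleep(delay)
    return (url, 200)

async def main():
    # Both simulated requests run concurrently on a single thread,
    # so two 0.2s "requests" take ~0.2s total rather than 0.4s.
    return await asyncio.gather(
        fake_fetch('https://www.google.com', 0.2),
        fake_fetch('https://www.stackoverflow.com', 0.2),
    )

loop = asyncio.get_event_loop()
start = time.monotonic()
results = loop.run_until_complete(main())
elapsed = time.monotonic() - start
print(results)
```

No executor and no threads are involved: the concurrency comes entirely from coroutines yielding at each await, which is exactly what an async HTTP client gives you for network I/O.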

user1527491 answered Sep 28 '22 05:09