TypeError: An asyncio.Future, a coroutine or an awaitable is required

I'm trying to make an asynchronous web scraper using BeautifulSoup and aiohttp. This is my initial code to get started. I'm getting a TypeError: An asyncio.Future, a coroutine or an awaitable is required and am having a hard time figuring out what is wrong with my code. I am new to Python and would appreciate any help with this.

import bs4
import asyncio
import aiohttp


async def parse(page):
    soup=bs4.BeautifulSoup(page,'html.parser')
    soup.prettify()
    print(soup.title)



async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)



loop=asyncio.get_event_loop()
loop.run_until_complete(request)

Traceback:

Traceback (most recent call last):
  File "C:\Users\User\Desktop\Bot\aio-req\parser.py", line 21, in <module>
    loop.run_until_complete(request)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\base_events.py", line 591, in run_until_complete
    future = tasks.ensure_future(future, loop=self)
  File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\tasks.py", line 673, in ensure_future
    raise TypeError('An asyncio.Future, a coroutine or an awaitable is '
TypeError: An asyncio.Future, a coroutine or an awaitable is required
asked Dec 25 '19 by user7657046


1 Answer

One issue is that loop.run_until_complete(request) should be loop.run_until_complete(request()). run_until_complete needs an awaitable, and request by itself is just the coroutine function; you have to call it, and the call is what returns a coroutine object.
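To see the difference, here is a minimal illustration (not part of your scraper) using asyncio.iscoroutine:

import asyncio

async def request():
    pass

print(asyncio.iscoroutine(request))    # False: request is a coroutine *function*
coro = request()
print(asyncio.iscoroutine(coro))       # True: calling it creates a coroutine object
asyncio.get_event_loop().run_until_complete(coro)  # this is the shape run_until_complete expects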

There are further problems: you are passing an aiohttp.ClientResponse object to parse and treating it as text/html, but BeautifulSoup needs the markup itself, which you get with await resp.text(). I got it to work with the following, but I don't know if it fits your needs because parse is no longer a coroutine.

def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()  # note: prettify() returns a string; its result is unused here
    return soup.title

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def request():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, "https://google.com")
        print(parse(html))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(request())
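Since your traceback shows Python 3.8, you can also let asyncio.run() create and close the event loop for you instead of fetching it yourself:

if __name__ == '__main__':
    asyncio.run(request())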

This also works:

def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            parse(await resp.text())
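Once any of these versions works, the payoff of async shows up when you fetch several pages at once. Here is a sketch of that with asyncio.gather; request_all and the URL list are illustrative, not from your code:

import asyncio
import aiohttp
import bs4

def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    return soup.title

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def request_all(urls):
    async with aiohttp.ClientSession() as session:
        # schedule all fetches concurrently; gather returns results in input order
        pages = await asyncio.gather(*(fetch(session, url) for url in urls))
    for page in pages:
        print(parse(page))

asyncio.run(request_all(["https://google.com", "https://python.org"]))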

And finally, keeping your original structure: parse stays a coroutine, you pass it the response object, and it awaits page.text() itself.

async def parse(page):
    soup = bs4.BeautifulSoup(await page.text(), 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)
answered Oct 13 '22 by wwii