I'm writing an app that scans a directory every second, checks for new files, and, if any appeared, sends them via a POST request and then archives them. Since the number of files that can appear in the directory ranges from 10 to 100, I decided to use asyncio and aiohttp to send the requests concurrently.
Code:
import asyncio
import os

import aiohttp
from aiohttp.client import ClientSession

BASE_DIR = '/path/to'
ARCHIVE_DIR = '/path/to/archive'
url = 'http://example.com/upload'  # placeholder for the real upload endpoint


async def scan():
    # Poll the directory every second and schedule an upload for each jpg found.
    while True:
        await asyncio.sleep(1)
        for file in os.listdir(BASE_DIR):
            if os.path.join(BASE_DIR, file).endswith('jpg'):
                asyncio.ensure_future(publish_file(file))


async def publish_file(file):
    # A new session (and connection pool) is created for every single upload.
    async with ClientSession(loop=loop) as session:
        async with session.post(url=url, data={'photo': open(os.path.join(BASE_DIR, file), 'rb')}) as response:
            if response.status == 200:
                await move_to_archive(file)


async def move_to_archive(file):
    os.rename(os.path.join(BASE_DIR, file), os.path.join(ARCHIVE_DIR, file))


loop = asyncio.get_event_loop()
coros = [
    asyncio.ensure_future(scan())
]

loop.run_until_complete(asyncio.wait(coros))
So the question is: if I want to send the requests concurrently, is it good practice to add coroutines to the loop like this: asyncio.ensure_future(publish_file(file))?
Yes, it's correct.
P.S. It's better to share the same session (perhaps with a limited number of parallel connections) than to recreate a connection pool on every POST request:
session = aiohttp.ClientSession(connector=aiohttp.TCPConnector(limit=10))
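For example, here is a minimal sketch of how the question's code could be reworked so that publish_file receives one shared session created in scan. The upload URL and the limit of 10 connections are placeholders, not part of the original code:

import asyncio
import os

import aiohttp

BASE_DIR = '/path/to'
ARCHIVE_DIR = '/path/to/archive'
URL = 'http://example.com/upload'  # placeholder endpoint


async def publish_file(session, file):
    # Reuse the shared session instead of creating a new one per upload.
    path = os.path.join(BASE_DIR, file)
    with open(path, 'rb') as photo:
        async with session.post(URL, data={'photo': photo}) as response:
            if response.status == 200:
                os.rename(path, os.path.join(ARCHIVE_DIR, file))


async def scan():
    # One session for the whole app; the connector caps parallel connections.
    async with aiohttp.ClientSession(connector=aiohttp.TCPConnector(limit=10)) as session:
        while True:
            await asyncio.sleep(1)
            for file in os.listdir(BASE_DIR):
                if file.endswith('jpg'):
                    asyncio.ensure_future(publish_file(session, file))

With a single session, every request reuses the same connection pool, and TCPConnector(limit=10) keeps no more than ten uploads in flight at a time.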