How to limit the number of concurrent processes using the subprocess module in asyncio Python

import asyncio
import asyncio.subprocess

args = "blah blah argument"
create = asyncio.create_subprocess_shell(args, stdout=asyncio.subprocess.PIPE)
proc = await create
output = await proc.stdout.read()

This is part of my server code, which gets thousands of parallel hits from clients. How should I limit the maximum number of subprocesses the server creates to run the blah blah argument? This code is using 100% of my CPU, and I need to deploy other servers on the same CPU.

Asked by Aravind
1 Answer

asyncio.Semaphore is a way to limit the number of simultaneous jobs via an internal counter:

import asyncio
from asyncio.subprocess import PIPE

sem = asyncio.Semaphore(10)  # allow at most 10 subprocesses at a time

async def do_job(args):
    async with sem:  # don't run more than 10 simultaneous jobs below
        proc = await asyncio.create_subprocess_shell(args, stdout=PIPE)
        output = await proc.stdout.read()
        await proc.wait()  # reap the finished process
        return output
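
For instance, even if request handlers fire off hundreds of these jobs at once, only 10 subprocesses exist at any moment; the rest wait at the semaphore. The main coroutine below is only an illustration (it is not part of the original answer), and echo hello stands in for the real command:

async def main():
    # 100 jobs are scheduled at once, but the semaphore lets
    # only 10 subprocesses run concurrently.
    outputs = await asyncio.gather(*(do_job("echo hello") for _ in range(100)))
    print(len(outputs), "jobs finished")

asyncio.run(main())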

Note that you should make sure jobs aren't submitted much faster than you can actually complete them; otherwise, you'll need something more complex than this.
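
If jobs can pile up faster than they finish, one option is a fixed pool of workers pulling from a bounded asyncio.Queue, so the backlog itself is capped and producers have to wait. This is only a sketch of that idea, not part of the original answer; the worker, run_job, and main names are illustrative, and echo hello again stands in for the real command:

import asyncio
from asyncio.subprocess import PIPE

async def worker(queue):
    # Each worker runs one subprocess at a time, forever.
    while True:
        args, fut = await queue.get()
        try:
            proc = await asyncio.create_subprocess_shell(args, stdout=PIPE)
            output = await proc.stdout.read()
            await proc.wait()
            fut.set_result(output)
        except Exception as exc:
            fut.set_exception(exc)
        finally:
            queue.task_done()

async def run_job(queue, args):
    # Request handlers call this; it waits if the backlog is full.
    fut = asyncio.get_running_loop().create_future()
    await queue.put((args, fut))
    return await fut

async def main():
    queue = asyncio.Queue(maxsize=100)  # bounded backlog: put() waits when full
    workers = [asyncio.create_task(worker(queue)) for _ in range(10)]  # 10 subprocesses max
    outputs = await asyncio.gather(*(run_job(queue, "echo hello") for _ in range(50)))
    print(len(outputs), "jobs done")
    for w in workers:
        w.cancel()

asyncio.run(main())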

Answered by Mikhail Gerasimov
