
From subprocess.Popen to multiprocessing

I have a function that invokes a process using subprocess.Popen in the following way:

    import subprocess
    from threading import Timer

    def func():
        ...
        # launch the external command and capture its stdout
        process = subprocess.Popen(substr, shell=True, stdout=subprocess.PIPE)

        # arm a timer that kills the process after timeout_sec seconds
        timeout = {"value": False}
        timer = Timer(timeout_sec, kill_proc, [process, timeout])
        timer.start()

        # collect the command's output line by line
        for line in process.stdout:
            lines.append(line)

        timer.cancel()
        if timeout["value"]:
            return 0
        ...
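
For reference, kill_proc isn't shown above; judging by the arguments passed to the Timer, it is presumably a small helper along these lines (a sketch of my assumption, not the exact code):

    def kill_proc(process, timeout):
        # record that the timeout fired, then terminate the child process
        timeout["value"] = True
        process.kill()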

I call this function from another function in a loop (e.g. over range(1, 100)). How can I make multiple calls to this function with multiprocessing, so that several processes run in parallel each time?

The processes don't depend on each other; the only constraint is that each process works on exactly one index (e.g. no two processes will work on index 1).

Thanks for your help

asked Feb 06 '23 by Dor Cohen

1 Answer

Just pass the index into your Popen call and create a worker pool with as many workers as you have CPU cores available.

    import multiprocessing
    import subprocess

    def func(index):
        ...
        # pass the index to the external command so each worker handles its own index
        process = subprocess.Popen(substr + " --index {}".format(index), shell=True, stdout=subprocess.PIPE)
        ...

    if __name__ == '__main__':
        # one worker per CPU core; map() hands each index to exactly one worker
        p = multiprocessing.Pool(multiprocessing.cpu_count())
        p.map(func, range(1, 100))
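
As a side note, if each call still needs the per-call timeout from your original function, one option is to drop the Timer and use Popen.communicate(timeout=...) inside the worker. Here is a minimal sketch of that idea; cmd and timeout_sec are placeholder names and not part of your original code:

    import multiprocessing
    import subprocess

    def func(index, timeout_sec=30, cmd="mytool"):
        # hypothetical command line; substitute your real substr-based command
        process = subprocess.Popen("{} --index {}".format(cmd, index),
                                   shell=True, stdout=subprocess.PIPE)
        try:
            # wait for the child, killing it if it exceeds the timeout
            out, _ = process.communicate(timeout=timeout_sec)
        except subprocess.TimeoutExpired:
            process.kill()
            process.communicate()  # reap the killed child
            return 0
        return out.splitlines()

    if __name__ == '__main__':
        with multiprocessing.Pool(multiprocessing.cpu_count()) as pool:
            results = pool.map(func, range(1, 100))

Because Pool.map hands each index to exactly one worker, your constraint that no two processes work on the same index is satisfied automatically.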

answered Feb 08 '23 by Maximilian Peters