 

How to launch multiple other python scripts all together from one and send them arguments?

Tags:

python

windows

I have to launch and execute 24 independent python scripts on Windows 7. I want one script to launch them all at the same time... without ruling them all (I'm not Sauron) or waiting for them to end. I find os.startfile() interesting for that, but I did not succeed in sending arguments to those 24 scripts.

coincoin1.py (one of the 24 scripts to be launched)

import sys
print "hello:",sys.argv 

Anti_Sauron_script.py (the one that will launch the 24 all together)

import sys, os
sys.argv = ["send", "those", "arguments"]  # this only changes argv in *this* process
os.startfile("C:\\Users\\coincoin1.py")

How to send arguments to those scripts and launch them all together?
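For reference, a minimal sketch (in Python 3 syntax, not from the original post) of one way to pass arguments: subprocess.Popen takes an argument list and returns immediately, so a loop can launch every script without waiting for any of them. The stand-in script written below is hypothetical, included only so the example is self-contained; with real scripts you would just loop over their paths.

```python
import os
import subprocess
import sys
import tempfile

# Write a stand-in for coincoin1.py (hypothetical; use your real script paths).
demo = os.path.join(tempfile.mkdtemp(), "coincoin1.py")
with open(demo, "w") as f:
    f.write("import sys\nprint('hello:', sys.argv[1:])\n")

# Popen returns as soon as the child is started, so this loop launches
# all the scripts together; everything after the script path becomes
# the child's sys.argv.
procs = [subprocess.Popen([sys.executable, demo, "send", "those", "arguments"])
         for _ in range(2)]

# (Optional) reap the children later if you ever want their exit codes.
codes = [p.wait() for p in procs]
```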

asked Jul 20 '11 by sol

1 Answer

You may use an independent process (multiprocessing.Process) and two queues (multiprocessing.Queue) to communicate with it, one for input and one for output. Example of starting the process:

import multiprocessing
import subprocess

def processWorker(input, result):
    while True:
        work = input.get()
        if work is None:  # sentinel: no more work
            break
        ## execute your command here
        pipe = subprocess.Popen(work, stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE, shell=True)
        stdout, stderr = pipe.communicate()
        result.put(pipe.returncode)

input  = multiprocessing.Queue()
result = multiprocessing.Queue()

p = multiprocessing.Process(target=processWorker, args=(input, result))
p.start()
commandlist = ['ls -l /', 'ls -l /tmp/']
for command in commandlist:
    input.put(command)
input.put(None)  # tell the worker there is no more work
for i in xrange(len(commandlist)):
    res = result.get(block=True)
    if res != 0:
        print 'One command failed'
p.join()

You can then keep track of which command is being executed by each subprocess simply by storing the command together with a workid (the workid can be a counter incremented as the queue is filled with new work). Using multiprocessing.Queue is robust since you do not need to rely on parsing stdout/stderr, and you avoid the related limitations. Moreover, you can easily manage more subprocesses.

You can also set a timeout on how long a get call should wait at most, e.g.:

import Queue
try:
    res = result.get(block=True, timeout=10)
except Queue.Empty:
    print 'Timed out waiting for a result'
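On Python 3 the same pattern applies, except the module is named queue (lowercase). A minimal sketch with an intentionally empty queue, so the timeout fires:

```python
import multiprocessing
import queue  # Python 2's "Queue" module was renamed to "queue" in Python 3

result = multiprocessing.Queue()  # nothing is ever put on it here

timed_out = False
try:
    # Wait at most 1 second for a result before giving up.
    res = result.get(block=True, timeout=1)
except queue.Empty:
    timed_out = True
```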
answered Nov 15 '22 by Cinquo