This question is related to a previous question I asked, and it seems like a simple question, but I have a hard time finding useful information or tutorials about multiprocessing.
My problem is that I would like to combine the data produced by the individual processes into one big array and then store it in my HDF5 file.
import numpy as np
import multiprocessing as mp
import tables as pt

def Simulation(i, output):
    # run a simulation which produces its results in A, with shape (4000, 3)
    A = np.zeros((4000, 3))
    output.put(A)

def handle_output(output):
    hdf = pt.openFile('simulation.h5', mode='w')
    hdf.createGroup('/', 'data')
    # Here the output should be joined somehow.
    # I would like to get it in the shape [4000, 3, 10]
    A = output.get()
    hdf.createArray('/data', 'array', A)
    hdf.close()
if __name__ == '__main__':
    output = mp.Queue()
    jobs = []
    proc = mp.Process(target=handle_output, args=(output, ))
    proc.start()
    for i in range(10):
        p = mp.Process(target=Simulation, args=(i, output))
        jobs.append(p)
        p.start()
    for p in jobs:
        p.join()
    output.put(None)
    proc.join()
Python multiprocessing join: the join() method blocks the calling process until the process whose join() is called has terminated. Without join(), the main process will not wait for the child processes to finish. The example above calls join() on each newly created process.
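For example, a minimal standalone sketch of join() (the worker function here is only an illustration, not part of the question's code):

import multiprocessing as mp

def worker():
    print('simulating')

if __name__ == '__main__':
    p = mp.Process(target=worker)
    p.start()
    p.join()  # block here until the worker process has terminated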
Use Pool. The multiprocessing Pool starmap() function calls the target function with multiple arguments, so it can be used instead of map(). It is probably the preferred approach when the target function executed in the pool takes more than one argument.
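For instance, a small sketch assuming a hypothetical two-argument simulate(i, scale) function (not part of the question's code):

import multiprocessing as mp

def simulate(i, scale):
    # a target function that takes two arguments
    return i * scale

if __name__ == '__main__':
    with mp.Pool(4) as pool:
        # starmap unpacks each (i, scale) tuple into simulate's arguments
        results = pool.starmap(simulate, [(i, 0.5) for i in range(10)])
    print(results)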
Python multiprocessing Pool can be used for parallel execution of a function across multiple input values, distributing the input data across processes (data parallelism). Below is a simple Python multiprocessing Pool example.
What you really need is a multiprocessing Pool
Just do something like:
def Simulation(i):
    # run the simulation and return its (4000, 3) result
    A = np.zeros((4000, 3))
    return A

p = mp.Pool(16)
result = p.map(Simulation, range(10))
# result is a list of ten (4000, 3) arrays; stack them and move the
# new axis to the end to get the desired (4000, 3, 10) shape
result = np.array(result).transpose(1, 2, 0)
p.close()
p.join()
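To then write the combined result to the HDF5 file, here is a minimal sketch using the same (old-style, pre-3.0) PyTables calls as in the question; with newer PyTables these are open_file, create_group and create_array. The Simulation body is just a placeholder:

import numpy as np
import multiprocessing as mp
import tables as pt

def Simulation(i):
    # placeholder for the real simulation; returns a (4000, 3) array
    return np.zeros((4000, 3))

if __name__ == '__main__':
    p = mp.Pool(10)
    results = p.map(Simulation, range(10))
    p.close()
    p.join()

    # np.dstack stacks the ten (4000, 3) arrays along a new third axis,
    # producing the desired (4000, 3, 10) array
    data = np.dstack(results)

    hdf = pt.openFile('simulation.h5', mode='w')
    hdf.createGroup('/', 'data')
    hdf.createArray('/data', 'array', data)
    hdf.close()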