I'd like to use Python's multiprocessing module to make use of a multi-core Linux server. I need all processes to have read/write access to the same shared memory. Instead of using a list or a queue, is it possible to have a multi-dimensional NumPy array as the shared object?
I found that even if you do not modify your NumPy array after fork()'ing a bunch of child processes, you will still see your RAM usage skyrocket, because the child processes end up copy-on-writing the object for some reason.
You can limit (or perhaps entirely avoid?) this problem by setting
"yourArray.flags.writeable = False"
BEFORE fork()'ing/Pool()'ing, which seems to keep the RAM usage down and is a LOT less hassle than the other methods :)
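For what it's worth, here is a minimal sketch of that approach (the array name, sizes, and worker function are made up for illustration): the array is created and marked read-only before the Pool is created, so the forked workers only ever read it.

    import numpy as np
    from multiprocessing import Pool

    # Illustrative example: a large array created before forking.
    big_array = np.random.rand(5000, 200)

    # Mark the array read-only BEFORE creating the Pool, as suggested above.
    big_array.flags.writeable = False

    def column_sum(col_index):
        # Child processes inherit big_array via fork() and only read from it.
        return big_array[:, col_index].sum()

    if __name__ == "__main__":
        with Pool(4) as pool:
            sums = pool.map(column_sum, range(big_array.shape[1]))
        print(len(sums))  # 200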
I think I know what you're looking for: https://bitbucket.org/cleemesser/numpy-sharedmem/issue/3/casting-complex-ndarray-to-float-in
There's a short description on the web page saying: A shared memory module for numpy by Sturla Molden and G. Varoquaux that makes it easy to share memory between processes in the form of NumPy arrays. Originally posted to SciPy-user mailing list.
I use it myself in exactly that way, to share NumPy arrays between processes, and it works very well for me.
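That package has its own API; as a point of comparison, a similar setup can be built with just the standard library on Python 3.8+ via multiprocessing.shared_memory. This is a minimal sketch, not the sharedmem package's interface, and the worker function, shape, and sizes are made up for illustration:

    import numpy as np
    from multiprocessing import Process, shared_memory

    def worker(shm_name, shape, dtype):
        # Attach to the existing shared memory block by name.
        existing = shared_memory.SharedMemory(name=shm_name)
        arr = np.ndarray(shape, dtype=dtype, buffer=existing.buf)
        arr *= 2          # writes are visible to the parent process
        existing.close()

    if __name__ == "__main__":
        # 10 x 10 float64 array -> 800 bytes of shared memory.
        shm = shared_memory.SharedMemory(create=True, size=10 * 10 * 8)
        arr = np.ndarray((10, 10), dtype=np.float64, buffer=shm.buf)
        arr[:] = 1.0

        p = Process(target=worker, args=(shm.name, arr.shape, arr.dtype))
        p.start()
        p.join()

        print(arr.sum())  # 200.0: the child's writes are shared
        shm.close()
        shm.unlink()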