Forming numpy array from array buffer from shared memory (multiprocessing) fails

I need to have a multidimensional array in a shared memory between two processes. I'm trying to make a simple example that works: I send [1, 2, 3, 4, 5, 6, 7, 8, 9] to the other process, which reshapes it into [[1, 2, 3], [4, 5, 6], [7, 8, 9]] without taking additional memory.

import multiprocessing
import ctypes
import numpy


def f(array):
    nmp = numpy.frombuffer(array.get_obj(), dtype=int)
    b = nmp.reshape((3, 3))


if __name__ == '__main__':
    arr = multiprocessing.Array(ctypes.c_int, [1,2,3,4,5,6,7,8,9])
    p = multiprocessing.Process(target=f, args=(arr,))
    p.start()
    p.join()

I followed the documentation exactly, but frombuffer raises this error:

ValueError: buffer size must be a multiple of element size

asked Apr 10 '14 by soshial

1 Answer

The dtype for the numpy array needs to be set explicitly to a 32-bit integer:

nmp = numpy.frombuffer(array.get_obj(), dtype="int32")

If you are on a 64-bit machine, dtype=int defaults to a 64-bit integer, so numpy was trying to interpret the buffer of 32-bit ctypes ints as 64-bit elements. The 9-element array occupies 9 × 4 = 36 bytes, which is not a multiple of 8, hence the error.
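A minimal sketch of the corrected example. Passing dtype=ctypes.c_int (rather than a hard-coded "int32") guarantees the numpy dtype matches the element type the shared array was created with, whatever the platform's C int size:

```python
import multiprocessing
import ctypes
import numpy


def f(array):
    # Interpret the shared buffer with the same element type it was
    # created with; no data is copied.
    nmp = numpy.frombuffer(array.get_obj(), dtype=ctypes.c_int)
    b = nmp.reshape((3, 3))  # reshape returns a view, not a copy
    print(b)


if __name__ == '__main__':
    arr = multiprocessing.Array(ctypes.c_int, [1, 2, 3, 4, 5, 6, 7, 8, 9])
    p = multiprocessing.Process(target=f, args=(arr,))
    p.start()
    p.join()
```

Because both frombuffer and reshape return views onto the shared buffer, the child process reads the same memory the parent allocated, with no additional copy.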

answered Oct 23 '22 by ebarr