pyserial with multiprocessing gives me a ctype error

Hi, I'm trying to write a module that lets me read and send data via pyserial. I have to be able to read the data in parallel to my main script. With the help of a Stack Overflow user, I have a basic and working skeleton of the program, but when I tried adding a class I created that uses pyserial (it handles finding the port, speed, etc.), found here, I get the following error:

File "<ipython-input-1-830fa23bc600>", line 1, in <module>
    runfile('C:.../pythonInterface1/Main.py', wdir='C:/Users/Daniel.000/Desktop/Daniel/Python/pythonInterface1')

  File "C:...\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)

  File "C:...\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "C:/Users/Daniel.000/Desktop/Daniel/Python/pythonInterface1/Main.py", line 39, in <module>
    p.start()

  File "C:...\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)

  File "C:...\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)

  File "C:...\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)

  File "C:...\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)

  File "C:...\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)

ValueError: ctypes objects containing pointers cannot be pickled

This is the code I am using to call the class in SerialConnection.py

import multiprocessing 
from time import sleep
from operator import methodcaller

from SerialConnection import SerialConnection as SC

class Spawn:
    def __init__(self, _number, _max):
        self._number = _number
        self._max = _max
        # Don't call update here

    def request(self, x):
        print("{} was requested.".format(x))

    def update(self):
        while True:
            print("Spawned {} of {}".format(self._number, self._max))
            sleep(2)

if __name__ == '__main__':
    '''
    spawn = Spawn(1, 1)  # Create the object as normal
    p = multiprocessing.Process(target=methodcaller("update"), args=(spawn,)) # Run the loop in the process
    p.start()
    while True:
        sleep(1.5)
        spawn.request(2)  # Now you can reference the "spawn"
    '''
    device = SC()
    print(device.Port)
    print(device.Baud)
    print(device.ID)
    print(device.Error)
    print(device.EMsg)
    p = multiprocessing.Process(target=methodcaller("ReadData"), args=(device,)) # Run the loop in the process
    p.start()
    while True:
        sleep(1.5)
        device.SendData('0003')

What am I doing wrong for this class to give me problems? Is there some restriction on using pyserial and multiprocessing together? I know it can be done, but I don't understand how...

Here is the traceback I get from Python:

Traceback (most recent call last):

  File "C:...\Python\pythonInterface1\Main.py", line 45, in <module>
    p.start()

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)

ValueError: ctypes objects containing pointers cannot be pickled
asked Aug 24 '19 by user169808


2 Answers

You are trying to pass a SerialConnection instance to another process as an argument. For that, Python first has to serialize (pickle) the object, and that is not possible for SerialConnection objects.
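
The underlying limitation is easy to reproduce without pyserial at all: any ctypes object that contains a pointer raises the same error when pickled. (On Windows, pyserial's Serial object keeps ctypes handles and structures internally, which is presumably what fails here.)

import ctypes
import pickle

# A plain ctypes pointer cannot be serialized
ptr = ctypes.pointer(ctypes.c_int(42))
pickle.dumps(ptr)
# ValueError: ctypes objects containing pointers cannot be pickled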

As said in Rob Streeting's answer, a possible solution would be to allow the SerialConnection object to be copied to the other process's memory using the fork that occurs when multiprocessing.Process.start is invoked, but this will not work on Windows, because Windows does not use fork.
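
You can check which start method your platform uses; on Windows it is always spawn, which is why the object has to be pickled at all:

import multiprocessing

# 'fork' on Linux, 'spawn' on Windows (and on macOS since Python 3.8)
print(multiprocessing.get_start_method())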

A simpler, cross-platform and more efficient way to achieve parallelism in your code would be to use a thread instead of a process. The changes to your code are minimal:

import threading
p = threading.Thread(target=methodcaller("ReadData"), args=(device,))
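
For completeness, here is a minimal sketch of the question's main loop using a thread instead of a process (it assumes the SerialConnection class exposes ReadData and SendData as in the question; the daemon flag is an addition so the reader thread does not keep the interpreter alive on exit):

import threading
from time import sleep
from operator import methodcaller

from SerialConnection import SerialConnection as SC

if __name__ == '__main__':
    device = SC()
    # A thread shares the parent's memory, so device never has to be pickled
    p = threading.Thread(target=methodcaller("ReadData"), args=(device,), daemon=True)
    p.start()
    while True:
        sleep(1.5)
        device.SendData('0003')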
answered by kmaork


I think the problem is due to something inside device being unpicklable (i.e., not serializable by Python). Take a look at this page to see whether any of the rules listed there are broken by something in your device object.
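
A quick way to find the offending attribute is to try pickling each one individually. This is a hypothetical debugging snippet, assuming device stores its state in ordinary instance attributes:

import pickle

for name, value in vars(device).items():
    try:
        pickle.dumps(value)
    except Exception as exc:
        print(name, "cannot be pickled:", exc)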

So why does device need to be picklable at all?

When a multiprocessing.Process is started on Unix, it uses fork() at the operating system level (unless another start method is specified) to create the new process. What this means is that the whole context of the parent process is "copied" over to the child. This does not require pickling, as it's done at the operating system level.

(Note: on Unix at least, this "copy" is actually a pretty cheap operation because it uses a feature called "copy-on-write". This means that both parent and child processes actually read from the same memory until one or the other modifies it, at which point the affected memory is actually copied for the process that changed it.)

However, the arguments of the function that you want the process to run do have to be pickled whenever they cannot simply be inherited through fork. On Windows, which uses the spawn start method, the target and its arguments are always pickled and sent to the child, and that includes your device variable.

I think you might be able to resolve your issue by allowing device to be copied as part of the fork operation rather than passing it in as an argument. To do this, though, you'll need a wrapper function around the operation you want your process to perform, in this case methodcaller("ReadData"). Something like this:

if __name__ == "__main__":

    device = SC()

    def call_read_data():
        # device is captured by this closure, so it is copied into the child
        # process by fork instead of being pickled as an argument
        device.ReadData()

    ...

    p = multiprocessing.Process(target=call_read_data) # Run the loop in the process
    p.start()
answered by Rob Streeting