 

python debug tools for multiprocessing [closed]

I have a Python script that works with threads, processes, and connections to a database. When I run my script, Python crashes.

I cannot reliably reproduce the case in which this happens.

Now I am looking for tools that give more information when Python crashes, or for a viewer that shows all of my created processes/connections.

asked Nov 23 '12 by bob morane


People also ask

How do I know if multiprocessing is working in Python?

We can check whether a process is alive via the multiprocessing.Process.is_alive() method.
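As an illustration (not from the original post), a minimal sketch of checking liveness from the parent process:

```python
import multiprocessing
import time

def worker():
    # keep the child busy long enough for the parent to observe it
    time.sleep(0.5)

if __name__ == '__main__':
    p = multiprocessing.Process(target=worker)
    p.start()
    print(p.is_alive())  # True while worker() is still sleeping
    p.join()
    print(p.is_alive())  # False after the process has finished
```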

How do I get data back from multiprocessing in Python?

Items can be retrieved from the queue by calls to get(). By default, get() blocks until an item is available to retrieve from the queue and does not use a timeout.
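A minimal sketch of that pattern (the producer function and the value 42 are illustrative, not from the original post):

```python
import multiprocessing

def producer(queue):
    # compute in the child and send the result back through the queue
    queue.put(42)

if __name__ == '__main__':
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=producer, args=(queue,))
    p.start()
    result = queue.get()  # blocks until the child puts an item
    p.join()
    print(result)  # 42
```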

How does multiprocess work in Python?

multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads.
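A small sketch of that idea (the square function is illustrative): a Pool distributes CPU-bound work across worker processes, each with its own interpreter and its own GIL:

```python
import multiprocessing

def square(x):
    return x * x

if __name__ == '__main__':
    # each worker is a separate process, so the GIL of the parent
    # interpreter does not serialize the computation
    with multiprocessing.Pool(processes=4) as pool:
        print(pool.map(square, range(5)))  # [0, 1, 4, 9, 16]
```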

How do you terminate a process in multiprocessing?

A process can be killed by calling the multiprocessing.Process.terminate() method. The call only terminates the target process, not its child processes.
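A minimal sketch of terminating a runaway child (the loop_forever function is illustrative; on Unix, terminate() sends SIGTERM and the child's exit code becomes negative):

```python
import multiprocessing
import time

def loop_forever():
    while True:
        time.sleep(0.1)

if __name__ == '__main__':
    p = multiprocessing.Process(target=loop_forever)
    p.start()
    p.terminate()     # kills only this process, not its own children
    p.join()
    print(p.exitcode) # nonzero: the process did not exit normally
```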


1 Answer

I created a module, RemoteException.py, that shows the full traceback of an exception raised in a worker process (Python 2 only). Download it and add this to your code:

import RemoteException

@RemoteException.showError
def go():
    raise Exception('Error!')

if __name__ == '__main__':
    import multiprocessing
    p = multiprocessing.Pool(processes=1)
    r = p.apply(go)  # the full traceback is shown here

OLD ANSWER

I had the same problem. This is what I did: a RemoteException helper to debug multiprocessing calls.

RemoteException.py

Copy the source and remove line 19, file.write('\nin %s ' % (Process.thisProcess,)), as well as the line import Process.

The problem is: multiprocessing only transfers the exception but loses the traceback. The code below creates an Exception object that saves the traceback and prints it in the calling process.

In your script you can do something like this:

import RemoteException

def f():
    try:
        # here is code that fails, but you do not know where
        pass
    except:
        ty, err, tb = RemoteException.exc_info()  # like sys.exc_info but with a better message
        raise ty, err, tb  # Python 2 three-argument raise

# here follows your multiprocessing call to f
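On Python 3 you can get a similar effect without a helper module: format the traceback inside the worker and embed it in the re-raised exception's message, so it survives the trip back to the parent. A minimal sketch (the wrapped function and the RuntimeError wrapper are illustrative assumptions, not part of the original answer):

```python
import multiprocessing
import traceback

def wrapped(x):
    try:
        return 1 / x  # fails when x == 0
    except Exception:
        # embed the child's full traceback in the message; the plain
        # exception object would lose it on the way back to the parent
        raise RuntimeError(traceback.format_exc())

if __name__ == '__main__':
    with multiprocessing.Pool(processes=1) as pool:
        try:
            pool.apply(wrapped, (0,))
        except RuntimeError as exc:
            print(exc)  # message contains the full ZeroDivisionError traceback
```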
answered Oct 03 '22 by User