Is there a way to spawn an interactive Python console (preferably IPython) during program execution, without pausing the main program, so that I can check and modify program variables? Something similar to what browsers offer for JavaScript.
I know about pdb.set_trace() and IPython.embed(), but both of them pause program execution and require placing them somewhere in the source code of the program.
This would be extremely useful for desktop game development in Python.
If you're only interested in debugging a Python script, the simplest way is to click the down-arrow next to the Run button in the editor and select Debug Python File in Terminal.
Debugging in Python is facilitated by the pdb module (the Python debugger), which comes built into the Python standard library. It is implemented as the class Pdb, which internally uses the bdb (basic debugger functions) and cmd (support for line-oriented command interpreters) modules.
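As a minimal sketch of how pdb is typically used (the function and its arguments here are made up for illustration), dropping pdb.set_trace() into a function pauses execution at that line and opens the debugger prompt in the terminal:

```python
import pdb

def compute_score(base, bonus):
    total = base + bonus
    # Uncomment the next line to pause here and inspect locals
    # (continue with "c", step with "n", print any variable by name):
    # pdb.set_trace()
    return total * 2

print(compute_score(10, 11))  # prints 42 when set_trace() is commented out
```

Note that this is exactly the blocking behavior the question wants to avoid: the whole program stops until you leave the debugger.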
IDLE has a debugger built into it. It is very useful for stepping through a program and watching variables change value. Start IDLE and open your program's source file; notice that the Shell shows "[DEBUG ON]".
You could roll your own somewhat with threading:
#!/usr/bin/python3
def _spawn_background_interpreter(*args, **kwargs):
    from threading import Thread

    def _open_interp(locs):
        import code
        code.interact(local=locs)

    locs = args[0] if args else None
    t = Thread(target=_open_interp, args=(locs,))
    t.daemon = True  # t.setDaemon(True) on pre-3.3 Pythons
    t.start()
Call with _spawn_background_interpreter(locals()).
I haven't tested it, but this will probably be fine if your program doesn't continuously print things to the console - otherwise it will be all munged together with the interactive interpreter.
The "opening a new console" idea is interesting, but very environment-specific, so I won't tackle that. I would be interested if there's a better prepackaged solution out there.
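To make the threading approach concrete, here is a small self-contained illustration of the same pattern; the game_state dict and the simulated loop are invented for the example, and the console thread simply runs code.interact directly:

```python
import code
import threading
import time

# Hypothetical shared state that a game loop might expose to the console.
game_state = {"score": 0, "running": True}

def spawn_background_interpreter(local_vars):
    # Daemon thread: the interpreter dies along with the main program.
    t = threading.Thread(target=code.interact,
                         kwargs={"local": local_vars},
                         daemon=True)
    t.start()
    return t

spawn_background_interpreter(locals())

# Simulated main loop - it keeps running while the console is available;
# typing game_state["score"] = 100 at the >>> prompt takes effect live.
for _ in range(3):
    game_state["score"] += 1
    time.sleep(0.01)
```

When stdin is not a real terminal (e.g. the script is piped or run non-interactively), code.interact sees EOF immediately and the console thread exits, so the main loop is unaffected either way.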
Edit: an attempt at a multiprocessing solution:
def _spawn_background_interpreter(*args, **kwargs):
    from multiprocessing import Process
    import sys, os

    def _open_interp(locs, stdin):
        import code
        sys.stdin = os.fdopen(stdin)
        code.interact(local=locs)

    locs = args[0] if args else None
    fileno = sys.stdin.fileno()
    p = Process(target=_open_interp, args=(locs, fileno))
    p.daemon = True
    p.start()
The reason I initially avoided multiprocessing is that each new process gets its own PID (and stdin). Thus, I had to pass the main process's stdin file descriptor to the child process, and things get a little hacky from there. NOTE that there is a bug in Python 3.2 and lower that causes tracebacks to spew any time you call exit() in a multiprocessing process. This is fixed in 3.3.
Unfortunately, the multiprocessing code only runs on POSIX-compliant systems - i.e. not on Windows. Not insurmountable, just going to require a more involved solution involving pipes.
Anyway, the multiprocessing implementation is likely to perform better for you if you're approaching 100% CPU utilization in your main thread. Give it a try if you're on *nix.