I'm writing a SWIG Python wrapper for a C++ library. When a critical error occurs, the library calls exit(err);, which in turn terminates the whole Python script that calls functions from that library.
Is there a way to wrap the exit() function so that it returns to the script or throws an exception?
You can put together a massive hack for this using longjmp and on_exit, although I highly recommend avoiding this in favour of a solution with multiple processes, which I'll outline later in the answer.
Suppose we have the following (broken by design) header file:
#ifndef TEST_H
#define TEST_H

#include <stdlib.h>

inline void fail_test(int fail) {
  if (fail) exit(fail);
}

#endif//TEST_H
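To see the underlying problem without SWIG at all: a C-level exit() takes the whole interpreter down with it, and Python never gets a chance to raise anything. A quick demonstration using ctypes to call the C runtime's exit() directly (forking first so the demo itself survives; Linux/POSIX assumed):

```python
import ctypes
import os

# Fork so that the C-level exit() only kills the child, not this interpreter.
pid = os.fork()
if pid == 0:
    # In the child: call the C library's exit(), just like the wrapped
    # library would on a critical error. This never returns.
    ctypes.CDLL(None).exit(7)
    os._exit(1)  # not reached

# In the parent: reap the child and inspect its exit status.
_, status = os.waitpid(pid, 0)
print(os.WEXITSTATUS(status))  # prints 7
```

No try/except in the child could have caught that: exit() bypasses the interpreter entirely, which is why the fixes below have to work at the C level.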
We want to wrap it and convert the call to exit() into a Python exception instead. One way to achieve this would be something like the following interface, which uses %exception to insert C code around the call to every C function from your Python interface:
%module test

%{
#include "test.h"
#include <setjmp.h>

static __thread int infunc = 0;
static __thread jmp_buf buf;

static void exithack(int code, void *data) {
  (void)data;
  if (!infunc) return;
  longjmp(buf, code);
}
%}

%init %{
  on_exit(exithack, NULL);
%}

%exception {
  infunc = 1;
  int err = 0;
  if (!(err = setjmp(buf))) {
    $action
  }
  else {
    // Raise an exception carrying the would-be exit code
    PyErr_Format(PyExc_Exception, "%d", err);
    infunc = 0;
    // Re-register: the interrupted exit() already consumed the handler
    on_exit(exithack, NULL);
    SWIG_fail;
  }
  infunc = 0;
}

%include "test.h"
This "works" when we compile it:
swig3.0 -python -py3 -Wall test.i
gcc -shared test_wrap.c -o _test.so -I/usr/include/python3.4 -Wall -Wextra -lpython3.4m
And we can demonstrate it with:
Python 3.4.2 (default, Oct 8 2014, 13:14:40)
[GCC 4.9.1] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import test
>>> test.fail_test(0)
>>> test.fail_test(123)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
Exception: 123
>>> test.fail_test(0)
>>> test.fail_test(999)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
Exception: 999
>>>
It's very ugly though: almost certainly not portable, and most likely undefined behaviour as well.
My advice would be not to do that and instead use a solution with two communicating processes. We can still have SWIG generate a nice module for us, and better yet we can rely on some high-level Python constructs to do most of the work. The complete example looks like:
%module test

%{
#include <stdint.h>
#include <stdio.h>
#include "test.h"

static void exit_handler(int code, void *fd) {
  FILE *f = fdopen((int)(intptr_t)fd, "w");
  fprintf(stderr, "In exit handler: %d\n", code);
  fprintf(f, "(dp0\nVexited\np1\nL%dL\ns.", code);
  fclose(f);
}
%}

%typemap(in) int fd %{
  $1 = PyObject_AsFileDescriptor($input);
%}

%inline %{
void enter_work_loop(int fd) {
  on_exit(exit_handler, (void *)(intptr_t)fd);
}
%}

%pythoncode %{
import os
import pickle

serialize = pickle.dump
deserialize = pickle.load

def do_work(wrapped, args_pipe, results_pipe):
    wrapped.enter_work_loop(results_pipe)
    while True:
        try:
            args = deserialize(args_pipe)
            f = getattr(wrapped, args['name'])
            result = f(*args['args'], **args['kwargs'])
            serialize({'value': result}, results_pipe)
            results_pipe.flush()
        except Exception as e:
            serialize({'exception': e}, results_pipe)
            results_pipe.flush()

class ProxyModule():
    def __init__(self, wrapped):
        self.wrapped = wrapped
        self.prefix = "_worker_"

    def __dir__(self):
        return [x[len(self.prefix):] for x in dir(self.wrapped)
                if x.startswith(self.prefix)]

    def __getattr__(self, name):
        def proxy_call(*args, **kwargs):
            serialize({
                'name': '%s%s' % (self.prefix, name),
                'args': args,
                'kwargs': kwargs
            }, self.args[1])
            self.args[1].flush()
            result = deserialize(self.results[0])
            if 'exception' in result: raise result['exception']
            if 'exited' in result:
                raise Exception('Library exited with code: %d' % result['exited'])
            return result['value']
        return proxy_call

    def init_library(self):
        def pipes():
            r, w = os.pipe()
            return os.fdopen(r, 'rb', 0), os.fdopen(w, 'wb', 0)
        self.args = pipes()
        self.results = pipes()
        self.worker = os.fork()
        if 0 == self.worker:
            do_work(self.wrapped, self.args[0], self.results[1])
%}
// rename all our wrapped functions to be _worker_FUNCNAME to hide them - we'll call them from within the other process
%rename("_worker_%s") "";
%include "test.h"
%pythoncode %{
import sys
sys.modules[__name__] = ProxyModule(sys.modules[__name__])
%}
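As an aside, the string that exit_handler fprintf's is a hand-rolled protocol-0 pickle of {'exited': code}, which is why deserialize on the Python side can read it just like any other result. You can check the framing yourself (exit code 2 used as an example):

```python
import pickle

# What exit_handler writes for code 2: MARK, DICT, the key 'exited' as a
# unicode string, the code as a protocol-0 long, then SETITEM and STOP.
payload = b"(dp0\nVexited\np1\nL2L\ns."
print(pickle.loads(payload))  # prints {'exited': 2}
```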
The complete example uses the following ideas:

- os.fork to spawn the worker process, with os.fdopen creating nicer file objects for use in Python
- __getattr__ to return proxy functions that run in the worker process
- __dir__ to keep TAB completion working inside IPython
- on_exit to intercept the exit (but not divert it) and report the code back via a pre-written ASCII-pickled object

You could make the call to init_library transparent and automatic if you wished. You also need to handle better the case where the worker hasn't been started or has already exited (it'll just block in my example), and you'll need to ensure that the worker gets cleaned up properly at exit. But it now lets you run:
Python 3.4.2 (default, Oct 8 2014, 13:14:40)
[GCC 4.9.1] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import test
>>> test.init_library()
>>> test.fail_test(2)
In exit handler: 2
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/mnt/lislan/ajw/code/scratch/swig/pyatexit/test.py", line 117, in proxy_call
    if 'exited' in result: raise Exception('Library exited with code: %d' % result['exited'])
Exception: Library exited with code: 2
>>>
and still be (somewhat) portable, but definitely well defined.
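The essence of the two-process approach can also be sketched without SWIG at all: fork a worker, send requests down one pipe as pickles, read results from another, and treat EOF on the result pipe (the worker died) as "the library exited". A minimal standalone version, with a Python-level os._exit standing in for the C library's exit() (all names here are my own, not from the wrapper above):

```python
import os
import pickle

def fail_test(code):
    """Stand-in for the wrapped C function: exits the process on error."""
    if code:
        os._exit(code)  # like the C library calling exit(code)
    return "ok"

def init_library():
    # One pipe for requests, one for results.
    args_r, args_w = os.pipe()
    res_r, res_w = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Worker: serve pickled requests until the library kills us.
        os.close(args_w); os.close(res_r)
        args = os.fdopen(args_r, 'rb')
        results = os.fdopen(res_w, 'wb')
        while True:
            request = pickle.load(args)
            pickle.dump({'value': fail_test(*request)}, results)
            results.flush()
    os.close(args_r); os.close(res_w)
    return pid, os.fdopen(args_w, 'wb'), os.fdopen(res_r, 'rb')

def proxy_call(state, *args):
    pid, args_w, res_r = state
    pickle.dump(args, args_w)
    args_w.flush()
    try:
        return pickle.load(res_r)['value']
    except EOFError:
        # The worker died mid-call: recover its exit code via waitpid.
        _, status = os.waitpid(pid, 0)
        raise Exception('Library exited with code: %d' % os.WEXITSTATUS(status))

state = init_library()
print(proxy_call(state, 0))   # prints ok
try:
    proxy_call(state, 2)
except Exception as e:
    print(e)                  # prints Library exited with code: 2
```

This version detects the death via EOF and waitpid rather than an on_exit handler, so it reports the code even if the library dies without running exit handlers; the SWIG version above instead has the worker announce its own exit code through the result pipe.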