I'm using Python 3.1.4 embedded as a scripting environment in a 64-bit application. So far I have run into a lot of limitations with the embedded Python, and I don't know whether that is normal or whether the developers of the application have blocked some functionality.
For example, the following code doesn't work:
from multiprocessing import Process

def f(name):
    print('hello', name)

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
    # --> error in forking.py: 'module' object has no attribute 'argv'
    # print(sys.argv) gives the same error
sys.executable returns the path to the application executable, not to a Python interpreter.
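For what it's worth, a quick check from inside the embedded interpreter makes the problem visible (the comments describe what I see; exact values depend on the host application):

import sys

print(sys.executable)        # path to the host application, not python.exe
print(hasattr(sys, 'argv'))  # False: the embedder never set sys.argv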
I've tried this as well:
multiprocessing.forking.set_executable('C:\Python31\python.exe')
multiprocessing.set_executable('C:\Python31\python.exe')
Without success. Is a workaround possible? It is very unlikely that I would have the leverage to get the developers of the application to change anything in their code.
Thanks
EDIT
I got it to work by adding the following:
sys.argv = ['c:/pathToScript/script.py']
I needed this line as well:
multiprocessing.set_executable('C:/Python31/python.exe')
Without it, another instance of the application opens instead of the child process running my code.
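Putting the pieces together, here is a minimal sketch of what worked for me; both paths are illustrative and have to match the actual script location and Python installation:

import sys
import multiprocessing

def f(name):
    print('hello', name)

if __name__ == '__main__':
    # The embedded interpreter never set sys.argv, so define it by hand;
    # multiprocessing transmits it to the child process on Windows.
    sys.argv = ['c:/pathToScript/script.py']
    # Point multiprocessing at a standalone interpreter; otherwise it
    # launches the host application itself as the child process.
    multiprocessing.set_executable('C:/Python31/python.exe')
    p = multiprocessing.Process(target=f, args=('bob',))
    p.start()
    p.join()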
The only problem left is that I can't use the methods that control the application itself (like create_project(), add_report(), ...) from the child processes. My primary goal was to be able to call several of those methods without waiting for the first one to finish, but I think that is just not possible.
By default, sys.argv is not available in embedded code. From the Embedding Python documentation:

The basic initialization function is Py_Initialize(). This initializes the table of loaded modules, and creates the fundamental modules builtins, __main__, and sys. It also initializes the module search path (sys.path). Py_Initialize() does not set the "script argument list" (sys.argv). If this variable is needed by Python code that will be executed later, it must be set explicitly with a call to PySys_SetArgvEx(argc, argv, updatepath) after the call to Py_Initialize().
On Windows, multiprocessing must spawn new processes from scratch. It uses the command-line switch --multiprocessing-fork to distinguish child processes, and it also transmits the original argv from parent to child.
Assigning sys.argv = ['c:/pathToScript/script.py'] before creating subprocesses, as you discovered, would seem to be a good workaround.
A second relevant piece of documentation is that of multiprocessing.set_executable():

Sets the path of the Python interpreter to use when starting a child process. (By default sys.executable is used). Embedders will probably need to do something like

set_executable(os.path.join(sys.exec_prefix, 'pythonw.exe'))

before they can create child processes. (Windows only)
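For the embedded case discussed here, note that sys.exec_prefix may itself point inside the host application rather than a regular Python install, so the docs' suggestion may not apply as-is; a sketch of both options:

import os
import sys
import multiprocessing

# The docs' suggestion: derive the interpreter path from the current install.
multiprocessing.set_executable(os.path.join(sys.exec_prefix, 'pythonw.exe'))

# In an embedded interpreter, sys.exec_prefix may not contain pythonw.exe,
# so a hard-coded path to a standalone Python install (as in the question)
# can be the more reliable choice:
# multiprocessing.set_executable('C:/Python31/python.exe')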