 

Multiprocessing in Python within a frozen script

I am trying to compile a script utilizing multiprocessing into a Windows executable. When I first compiled it, I ran into the same issue as "Why python executable opens new window instance when function by multiprocessing module is called on windows". Following the accepted answer there, I adjusted my script so that it looks like:

from multiprocessing import freeze_support
# my functions
if __name__ == "__main__":
    freeze_support()
    # my script

And this again works perfectly when run as a script. However, when I compile and run it I encounter:

[Error message screenshot]

I've underlined part of the error in green. That specific line refers to

freeze_support()

in my script. Furthermore, it is not actually encountered on this line, but when my script goes to multiprocess which is something like:

p = multiprocessing.Process(target=my_function, args=[my_list])
p.start()
p1 = multiprocessing.Process(target=my_function, args=[my_list])
p1.start()
p.join()
p1.join()

Is this an error in the multiprocessing module (specifically line 148) or am I misunderstanding the answer I linked, or something else?

I'll also note that the script does work correctly when compiled, but you have to click "OK" on an error message for every process that is spawned (quite a lot), and every error message is exactly the same. Would this mean I am improperly ending the processes with p.join()?

I've also tried the solution at Python 3.4 multiprocessing does not work with py2exe which recommends adding

multiprocessing.set_executable(os.path.join(sys.exec_prefix, 'pythonw.exe'))

to your script, yet this causes an error even when run as a plain script (not yet compiled):

FileNotFoundError: [WinError 2] The system cannot find the file specified

Thanks for the help!

freeze_support documentation: https://docs.python.org/2/library/multiprocessing.html#multiprocessing.freeze_support

asked Mar 23 '19 by Reedinationer


People also ask

How do you lock in multiprocessing in Python?

Python provides a mutual exclusion lock for use with processes via the multiprocessing.Lock class. An instance of the lock can be created and then acquired by processes before accessing a critical section, and released after the critical section. Only one process can hold the lock at any time.
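For illustration, here is a minimal sketch of that pattern; the report function and the number of workers are invented for the example:

from multiprocessing import Process, Lock

def report(lock, worker_id):
    # Only one process at a time may enter this critical section
    with lock:
        print(f"worker {worker_id} holds the lock")

if __name__ == "__main__":
    lock = Lock()
    workers = [Process(target=report, args=(lock, i)) for i in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()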

What is multiprocessing.freeze_support()?

multiprocessing.freeze_support() allows a frozen program to create and start new processes via the multiprocessing.Process class when the program is frozen for distribution on Windows.
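A minimal sketch of where the call goes; the work function is made up for the example:

import multiprocessing

def work():
    print("child process running")

if __name__ == "__main__":
    # Must come right after the __main__ guard so a frozen child
    # process can bootstrap itself instead of re-running the script
    multiprocessing.freeze_support()
    p = multiprocessing.Process(target=work)
    p.start()
    p.join()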

Does Python multiprocessing support shared memory?

multiprocessing.shared_memory — shared memory for direct access across processes (new in version 3.8). This module provides a class, SharedMemory, for the allocation and management of shared memory to be accessed by one or more processes on a multicore or symmetric multiprocessor (SMP) machine.
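A small sketch of the API, assuming Python 3.8 or newer; the block size and contents are arbitrary:

from multiprocessing import shared_memory

# Create a 16-byte shared block; other processes can attach to it by name
shm = shared_memory.SharedMemory(create=True, size=16)
shm.buf[:5] = b"hello"

# Another process would attach like this:
#   existing = shared_memory.SharedMemory(name=shm.name)
#   bytes(existing.buf[:5])  # b"hello"

shm.close()
shm.unlink()  # free the block once no process needs it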

How do you exit multiprocessing in Python?

Call kill() on the process. The method is called on the multiprocessing.Process instance for the process that you wish to terminate.
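A hedged sketch (the looping worker is invented for illustration; kill() requires Python 3.7+ and, on Windows, behaves like terminate()):

import time
from multiprocessing import Process

def loop_forever():
    while True:
        time.sleep(0.1)

if __name__ == "__main__":
    p = Process(target=loop_forever)
    p.start()
    time.sleep(1)
    p.kill()        # forcefully stop the child; p.terminate() is the milder option
    p.join()
    print(p.exitcode)  # on Unix, a negative value indicates the terminating signal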

What is multiprocessing in Python?

Python provides a multiprocessing module that includes an API, similar to the threading module, to divide the program into multiple processes.
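The example in the source snippet was truncated to its final print("END!") line; a minimal reconstruction with an invented worker function might look like:

from multiprocessing import Process

def greet(name):
    print(f"Hello from {name}")

if __name__ == "__main__":
    p = Process(target=greet, args=("worker",))
    p.start()
    p.join()
    print("END!")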

How to divide a program into multiple processes in Python?

In a multiprocessing system, the application is broken into smaller routines that the OS runs as separate processes for better performance. Python provides a multiprocessing module that includes an API, similar to the threading module, to divide the program into multiple processes.
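One common way to divide work across processes is multiprocessing.Pool; this sketch uses an invented square function and an arbitrary worker count:

from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # The input range is split into chunks and processed in parallel
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, ...]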

What is the current_process() method in Python?

The current_process() method gives us the Process object, and therefore the name, of the process that calls our function. If we don't assign a name to one of the processes, the multiprocessing module assigns a number to each process as part of its name.
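A short sketch of both cases; the announce function and the name "worker-1" are made up:

from multiprocessing import Process, current_process

def announce():
    # current_process() returns the Process object for the calling process
    print(current_process().name)

if __name__ == "__main__":
    named = Process(target=announce, name="worker-1")
    unnamed = Process(target=announce)   # gets an automatic "Process-N" style name
    named.start()
    unnamed.start()
    named.join()
    unnamed.join()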

How to lock and resume processes in Python multiprocessing?

The lock is used for locking the processes while using multiprocessing in Python. With its acquire() and release() methods, you can pause and resume processes, which lets you run specific tasks exclusively while the other processes wait.
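A sketch of the explicit acquire()/release() form (the critical_section function and labels are invented; the with-statement form shown earlier is usually preferred):

from multiprocessing import Process, Lock

def critical_section(lock, label):
    lock.acquire()        # blocks until the lock is free
    try:
        print(f"{label} is in the critical section")
    finally:
        lock.release()    # always release, even if the body raises

if __name__ == "__main__":
    lock = Lock()
    procs = [Process(target=critical_section, args=(lock, f"p{i}")) for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()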




1 Answer

This is not an issue of the multiprocessing library or py2exe per se but a side effect of the way you run the application. The py2exe documentation contains some discussion on this topic:

A program running under Windows can be of two types: a console program or a windows program. A console program is one that runs in the command prompt window (cmd). Console programs interact with users using three standard channels: standard input, standard output and standard error […].

As opposed to a console application, a windows application interacts with the user using a complex event-driven user interface and therefore has no need for the standard channels whose use in such applications usually results in a crash.

Py2exe will work around these issues automatically in some cases, but at least one of your processes has no attached standard output (sys.stdout is None), which means that sys.stdout.flush() is effectively None.flush(), which raises the error you are getting. The documentation linked above has an easy fix that redirects all outputs to files:

import sys
sys.stdout = open("my_stdout.log", "w")
sys.stderr = open("my_stderr.log", "w")

Simply add those lines at the entry point of your processes. There is also a relevant documentation page on the interactions between Py2Exe and subprocesses.
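To make the placement concrete, here is a sketch that reuses the my_function and my_list names from the question; the body of the worker is invented and only shows where the redirection would go:

import sys
import multiprocessing

def my_function(my_list):
    # Frozen windows-mode processes may have no console attached, so give
    # them real file objects before anything tries to write or flush
    sys.stdout = open("my_stdout.log", "w")
    sys.stderr = open("my_stderr.log", "w")
    # ... rest of the worker code ...
    print(len(my_list))

if __name__ == "__main__":
    multiprocessing.freeze_support()
    p = multiprocessing.Process(target=my_function, args=[["a", "b"]])
    p.start()
    p.join()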

answered Oct 04 '22 by Jan-Gerd