I tried using logging with multiprocessing and found that under Windows I get a different root logger in the child process, while under Linux it is the same object.
The test code:
main.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import logging
import multiprocessing
from mymod import func
def m_func():
    server = multiprocessing.Process(target=func, args=())
    server.start()

logger = logging.getLogger()
#print 'in global main: ', logger

if __name__ == '__main__':
    print 'in main: ', logger
    m_func()
mymod.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import logging
logger = logging.getLogger()
# print 'in global func: ', logger
def func():
    print 'in func: ', logger
Under Linux, the result is:
in main: <logging.RootLogger object at 0x10e4d6d90>
in func: <logging.RootLogger object at 0x10e4d6d90>
But under Windows 7, 64-bit, I get a different root logger in main and in func:
in main: <logging.RootLogger object at 0x00000000021FFD68>
in func: <logging.RootLogger object at 0x00000000023BC898>
And if I initialize the root logger in the main script, how can I keep its settings, such as the level, in the child process under Windows?
It seems to me that this could be linked to the following platform-dependent behaviour documented for multiprocessing:
16.6.3.2. Windows
Since Windows lacks os.fork() it has a few extra restrictions:
(...)
Global variables
Bear in mind that if code run in a child process tries to access a global variable, then the value it sees (if any) may not be the same as the value in the parent process at the time that Process.start was called.
However, global variables which are just module level constants cause no problems.
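For example, under the spawn-style start that Windows uses, a level configured inside the parent's if __name__ == '__main__' block never reaches the child. A minimal sketch of this (Python 3 print syntax; the file name demo_level.py is just illustrative):

#!/usr/bin/env python
# demo_level.py -- shows that logging configuration done in the parent
# is not inherited by a spawned child process on Windows.
import logging
import multiprocessing

def child():
    # The spawned child re-imports this module, but the __main__ guard
    # below is not executed there, so the child's root logger still has
    # its default WARNING level.
    print('child root level:', logging.getLogger().getEffectiveLevel())   # 30

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG)  # configures only the parent
    print('parent root level:', logging.getLogger().getEffectiveLevel())  # 10
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()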
From your question, I assume that this results in a logging.basicConfig() call that does not reach all of your processes. A solution is to have your child processes log to a Queue (using a QueueHandler) and have a dedicated thread in your main process listen to that queue.
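A minimal sketch of that queue-based setup, using logging.handlers.QueueHandler and QueueListener from the Python 3 standard library (added in 3.2; on Python 2 you would need a backport such as logutils). The file name, format string and messages are illustrative:

#!/usr/bin/env python
# queue_logging.py -- children send log records to a multiprocessing.Queue;
# a listener thread in the parent writes them out with the parent's handlers.
import logging
import logging.handlers
import multiprocessing

def worker(log_queue):
    # Runs in the child process: route every record through the queue.
    root = logging.getLogger()
    root.handlers[:] = [logging.handlers.QueueHandler(log_queue)]
    root.setLevel(logging.DEBUG)
    root.info('hello from the child process')

if __name__ == '__main__':
    log_queue = multiprocessing.Queue()

    # Real handlers live only in the parent process.
    console = logging.StreamHandler()
    console.setFormatter(
        logging.Formatter('%(processName)s %(levelname)s %(message)s'))

    # QueueListener starts a dedicated thread that drains the queue
    # and hands each record to the console handler.
    listener = logging.handlers.QueueListener(log_queue, console)
    listener.start()

    p = multiprocessing.Process(target=worker, args=(log_queue,))
    p.start()
    p.join()

    listener.stop()

With this layout the children only need the cheap QueueHandler; all formatting, filtering and I/O happen in one place in the parent, so the level and handler configuration no longer has to survive the trip across Process.start().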