In Python 2's subprocess module, Popen can be given an env dictionary. It seems the equivalent with Process in the multiprocessing module is to pass the env dictionary in args or kwargs, and then call os.environ['FOO'] = value in the target.
Is it the right way?
Is it safe? I mean, is there no risk that the environment of the parent process or of other child processes gets modified?
Here is an example (that works).
import multiprocessing
import time
import os

def target(someid):
    os.environ['FOO'] = "foo%i" % someid
    for i in range(10):
        print "Job %i: " % someid, os.environ['FOO']
        time.sleep(1)

if __name__ == '__main__':
    processes = []
    os.environ['FOO'] = 'foo'
    for someid in range(3):
        p = multiprocessing.Process(target=target, args=(someid,))
        p.start()
        processes.append(p)
    for i in range(10):
        print "Parent: ", os.environ['FOO']
        time.sleep(1)
    for p in processes:
        p.join()
Yes, that's the right way to do it. While the child will inherit its initial environment from the parent, subsequent changes to os.environ made in the child will not affect the parent, and vice versa:
import os
import multiprocessing

def myfunc(q_out, q_in):
    print "child start: " + os.environ['FOO']
    os.environ['FOO'] = "child_set"
    print "child after changing: " + os.environ['FOO']
    q_out.put(None)  # tell the parent we changed our environment
    q_in.get()       # wait for the parent to change its own
    print "child after parent changing: " + os.environ['FOO']

if __name__ == "__main__":
    os.environ['FOO'] = 'parent_set'
    q_out = multiprocessing.Queue()
    q_in = multiprocessing.Queue()
    proc = multiprocessing.Process(target=myfunc, args=(q_out, q_in))
    proc.start()
    q_out.get()  # wait until the child has made its change
    print "parent after child changing: " + os.environ['FOO']
    os.environ['FOO'] = "parent_set_again"
    q_in.put(None)
    proc.join()
Output:
child start: parent_set
child after changing: child_set
parent after child changing: parent_set
child after parent changing: child_set
If you need to pass an initial environment to the child, you can just pass it in the args or kwargs list:
import os
import time
import multiprocessing

def myfunc(env=None):
    time.sleep(3)
    if env is not None:
        os.environ = env  # replace the child's environment mapping
    print os.environ['FOO']

if __name__ == "__main__":
    child_env = os.environ.copy()
    for i in range(3):
        child_env['FOO'] = "foo%s" % (i,)
        proc = multiprocessing.Process(target=myfunc, kwargs={'env': child_env})
        proc.start()
Output:
foo0
foo1
foo2
Note that if you're using a multiprocessing.Pool, you can use the initializer/initargs keyword arguments to set the correct environment once at the start of each worker process in the pool:
import os
import multiprocessing

def init(env):
    os.environ = env  # runs once in each worker process

def myfunc():
    print os.environ['FOO']

if __name__ == "__main__":
    child_env = os.environ.copy()
    child_env['FOO'] = "foo"
    pool = multiprocessing.Pool(initializer=init, initargs=(child_env,))
    pool.apply(myfunc, ())
Output:
foo
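If you want to verify the isolation for yourself, here is a minimal self-check. It is written with print() calls so it runs under Python 2 or 3 (the rest of this answer uses Python 2 print statements); the names child, q, and seen_in_child are just for illustration:

```python
import os
import multiprocessing

def child(q):
    # The child mutates its own copy of the environment...
    os.environ['FOO'] = 'child_set'
    q.put(os.environ['FOO'])

if __name__ == '__main__':
    os.environ['FOO'] = 'parent_set'
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=child, args=(q,))
    p.start()
    seen_in_child = q.get()
    p.join()
    # ...but the parent's environment is untouched.
    print(seen_in_child)        # child_set
    print(os.environ['FOO'])    # parent_set
```

Whatever the child does to os.environ, the parent still sees its own value after the child exits.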