Python: how to get the final output of multiple system commands?

There are many posts here on SO, like this one: Store output of subprocess.Popen call in a string

There is a problem with more complicated commands, however. For example, say I need to get the output of this:

ps -ef|grep something|wc -l

A plain subprocess call won't do the job, because the argument to subprocess is a list of the form [name of program, arguments], so it is not possible to run more sophisticated commands (several programs, pipes, etc.).
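A minimal sketch of what I mean, using the same pipeline as an example:

import subprocess

# This works: one program, with its arguments given as a list.
subprocess.check_output(["ps", "-ef"])

# This does not do what I want: "|", "grep", etc. are passed to ps as
# literal arguments instead of being interpreted as a shell pipeline.
subprocess.check_output(["ps", "-ef", "|", "grep", "something", "|", "wc", "-l"])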

Is there a way to capture the output of a chain of multiple commands?

asked Dec 11 '22 by 3p1i4

2 Answers

Just pass the shell=True option to subprocess.check_output:

import subprocess
subprocess.check_output('ps -ef | grep something | wc -l', shell=True)
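Note that on Python 3 check_output returns bytes; if you want the count as a number, something along these lines should work (just one option):

import subprocess

# Decode the bytes, strip the trailing newline, and convert the count to int.
count = int(subprocess.check_output(
    'ps -ef | grep something | wc -l', shell=True).decode().strip())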
answered Dec 21 '22 by John La Rooy

For a no-shell, clean version using the subprocess module, you can use the following example (from the documentation):

output = `dmesg | grep hda`

becomes

p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]

Here the Python program essentially does what the shell does: it pipes the output of each command into the next one. An advantage of this approach is that the programmer has full control over the individual standard error streams of the commands (they can be suppressed, logged, etc., as needed).
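For instance, the same pattern applied to the ps -ef | grep something | wc -l pipeline from the question, with grep's standard error suppressed, could look like the following sketch (assuming Python 3.3+, where subprocess.DEVNULL is available):

from subprocess import Popen, PIPE, DEVNULL

p1 = Popen(["ps", "-ef"], stdout=PIPE)
p2 = Popen(["grep", "something"], stdin=p1.stdout, stdout=PIPE, stderr=DEVNULL)
p3 = Popen(["wc", "-l"], stdin=p2.stdout, stdout=PIPE)
p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
p2.stdout.close()  # Same for p2 with respect to p3.
output = p3.communicate()[0]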

That said, I generally prefer the shell-delegation approach suggested by nneonneo, subprocess.check_output('ps -ef | grep something | wc -l', shell=True): it is general, very legible, and convenient.

answered Dec 21 '22 by Eric O Lebigot