Python: Run a daemon sub-process & read stdout

Tags:

python

I need to run a program and gather its output to stdout. This program (socat) needs to run in the background for the duration of my Python script. Socat sits in daemon mode once it's running, but first it outputs some lines to stdout that I need for the rest of my script.

Command: socat -d -d PTY: PTY:

Output:

2011/03/23 21:12:35 socat[7476] N PTY is /dev/pts/1
2011/03/23 21:12:35 socat[7476] N PTY is /dev/pts/2
2011/03/23 21:12:35 socat[7476] N starting data transfer loop with FDs [3,3] and [5,5]

...

I basically want to run that at the start of my program and leave it running until the script terminates, but I need to read the two /dev/pts/X names into Python.

Can anyone tell me how to do this?

I came up with the following, which just hangs; I guess it's blocking until the child process terminates.

#!/usr/bin/python
from subprocess import Popen, PIPE, STDOUT

cmd = 'socat -d -d PTY: PTY: &'

p = Popen(cmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=True)
output = p.stdout.read()

# Process the output 
print(output)

Thanks for any help

EDIT: It seems socat may write to stderr, but the script still just hangs, with and without the &, even when reading from stderr.

Jason asked Mar 23 '11 at 21:03


2 Answers

#!/usr/bin/python
from subprocess import Popen, PIPE
import pty
import os

cmd = 'socat -d -d PTY: PTY:'

# Open a pseudo-terminal pair; socat writes its diagnostics to the slave end,
# and we read them back, unbuffered, from the master end.
master, slave = pty.openpty()

p = Popen(cmd, shell=True, stdin=PIPE, stdout=slave, stderr=slave, close_fds=True)
stdout = os.fdopen(master)
print(stdout.readline())
print(stdout.readline())

There are two problems with your version. First, you call read() with no argument, which means it attempts to read everything. But since socat never terminates, it never decides that it has read everything. By using readline(), Python only reads until it finds a newline, which from my understanding of your problem is what you need.

The second problem is that the C standard library buffers output when it is writing to a pipe. We solve that by creating a pty with the openpty() function and passing its slave end as both stdout and stderr of the subprocess, so the child behaves as if it were writing to a terminal. We then use fdopen() to turn the master file descriptor into a regular Python file object, and the buffering problem goes away.
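Since you specifically need the two /dev/pts/X names, here is a short sketch of how the lines read above could be parsed; it continues from the snippet, and the regular expression is an assumption based on the sample output in your question:

import re

# Pattern assumed from the "... N PTY is /dev/pts/1" lines in the question's output.
PTY_RE = re.compile(r'N PTY is (\S+)')

pty_names = []
while len(pty_names) < 2:
    match = PTY_RE.search(stdout.readline())
    if match:
        pty_names.append(match.group(1))

print(pty_names)  # e.g. ['/dev/pts/1', '/dev/pts/2']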

I don't know what you are doing with socat, but I wonder whether it could be replaced by using the pty module. You are copying one pty to another, and openpty is creating a pair of ptys. Perhaps you can use those directly?
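If that does fit your use case, a minimal sketch of getting a pty name without socat at all; note that pty.openpty() gives you one master/slave pair, not two ptys linked to each other the way socat PTY: PTY: provides:

import os
import pty

# One pseudo-terminal pair: data written to `master` appears on the slave device and vice versa.
master, slave = pty.openpty()
print(os.ttyname(slave))  # e.g. /dev/pts/3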

Winston Ewert answered Oct 23 '22 at 16:10


The subprocess probably never closes stdout, so the read() call waits forever. To make matters worse, it will probably buffer its output when it figures out that it's a pipe instead of a console (the standard C library does this automatically, so this isn't a function of how cleverly written the app is). If so, probably the only option is to use an expect-style library such as Pexpect.
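For example, a minimal sketch using pexpect (assuming the pexpect package is installed); pexpect runs the child under a pseudo-terminal, which sidesteps the pipe buffering, and the regular expression below is an assumption based on the output shown in the question:

import pexpect

# Spawn socat under a pty and scrape the two PTY names from its startup messages.
child = pexpect.spawn('socat -d -d PTY: PTY:')

pty_names = []
for _ in range(2):
    child.expect(r'N PTY is (\S+)')
    pty_names.append(child.match.group(1).decode())

print(pty_names)  # e.g. ['/dev/pts/1', '/dev/pts/2']

# socat keeps running until `child` is closed or the script exits.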

Marcelo Cantos answered Oct 23 '22 at 18:10