 

bash: force exec'd process to have unbuffered stdout


I've got a script like:

#!/bin/bash
exec /usr/bin/some_binary > /tmp/my.log 2>&1

Problem is that some_binary sends all of its logging to stdout, and buffering makes it so that I only see output in chunks of a few lines. This is annoying when something gets stuck and I need to see what the last line says.

Is there any way to make stdout unbuffered before I do the exec that will affect some_binary so it has more useful logging?

(The wrapper script is only setting a few environment variables before the exec, so a solution in perl or python would also be feasible.)
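For concreteness, the wrapper looks roughly like this (the variable names below are hypothetical placeholders, not the real ones):

#!/bin/bash
# Hypothetical environment setup before handing control to the binary
export SOME_VAR=foo
export OTHER_VAR=bar
exec /usr/bin/some_binary > /tmp/my.log 2>&1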

bstpierre asked Jul 26 '10

2 Answers

GNU coreutils-8.5 also has the stdbuf command to modify I/O stream buffering:

http://www.pixelbeat.org/programming/stdio_buffering/

So, in your example case, simply invoke:

stdbuf -oL /usr/bin/some_binary > /tmp/my.log 2>&1 

This will allow text to appear immediately line-by-line (once a line is completed with the end-of-line "\n" character in C). If you really want immediate output, use -o0 instead.

This approach may be preferable if you do not want to introduce a dependency on expect via the unbuffer command. The unbuffer approach, on the other hand, is needed if you have to fool some_binary into thinking that its standard output is a real tty.
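Putting this together with the original wrapper script, a minimal sketch (assuming line buffering is enough for your needs) would be:

#!/bin/bash
# stdbuf adjusts the child's stdio buffering before running it;
# -oL line-buffers stdout, while -o0 would make it fully unbuffered
exec stdbuf -oL /usr/bin/some_binary > /tmp/my.log 2>&1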

zaga answered Oct 07 '22


You might find that the unbuffer script that comes with expect helps.
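As a sketch of how that would slot into the wrapper script (assuming the expect package, which provides unbuffer, is installed):

#!/bin/bash
# unbuffer runs some_binary under a pseudo-terminal, so its stdio
# behaves as if attached to a tty instead of being fully buffered
exec unbuffer /usr/bin/some_binary > /tmp/my.log 2>&1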

Dennis Williamson answered Oct 07 '22