 

Force Node.js to flush writes to child processes

I spawn a child process like this:

var child = require('child_process');
var proc = child.spawn('python', ['my_script.py', '-p', 'example']);

I also set some data handling:

proc.stdin.setEncoding('utf8');
proc.stdout.setEncoding('utf8');
proc.stderr.setEncoding('utf8');

proc.stdout.on('data', function (data) {
  console.log('out: ' + data);
});

proc.stderr.on('data', function (data) {
  console.log('err: ' + data);
});

proc.on('close', function (code) {
  console.log('subprocess exited with status ' + code);
  proc.stdin.end();
});

My Python script reads lines from stdin and, for each line, does some operations and prints the result to stdout. It works fine in the shell (I type a line and get the output immediately), but when I do this in Node:

for (var i = 0; i < 10; i++) {
  proc.stdin.write('THIS IS A TEST\n');
}

I get nothing.
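For reference, the Python side is essentially a loop like this (a minimal stand-in; the real script does more work per line):

import sys

for line in sys.stdin:
    # do some operations on the line, then print the result
    print('processed: ' + line.strip())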

I managed to fix it by calling proc.stdin.end(), but that also terminates the child process (which I want to keep running in the background, streaming data).

I also managed to trigger a flush by filling the buffer with lots of writes, but that's not really an option.

Is there any way to manually flush the stream?

asked Jun 17 '13 by kaoD


1 Answer

You are not flushing the output from Python after the print statement. I had a similar problem, and @Alfe answered my question. Take a look at this:

Stream child process output in flowing mode
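In short, the fix is on the Python side: flush stdout after printing, because stdout is block-buffered when it is connected to a pipe rather than a terminal. A minimal sketch of the loop with an explicit flush:

import sys

for line in sys.stdin:
    print('processed: ' + line.strip())
    sys.stdout.flush()  # push the output through the pipe immediately

Alternatively, start the interpreter unbuffered from Node (child.spawn('python', ['-u', 'my_script.py', '-p', 'example'])), or on Python 3.3+ use print(..., flush=True).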

answered Nov 09 '22 by iFadey