Increase maximum byte size of a Node.js pipe

I'm trying to write a small commander script that proxies to other scripts in my project, and I'm using Node to pipe the stdout of the spawned process to the current process's stdout:

var childProcess = require("child_process");

function runCommand(command, args) {
    var commandProcess = childProcess.spawn(command, args);
    // stream the child's stdout straight into our own stdout
    commandProcess.stdout.pipe(process.stdout);
    commandProcess.on("exit", process.exit);
}

This works fine until I start getting large output from my subprocesses (for example, one of them is a Maven command). What I'm seeing is that it only prints the first 8192 bytes of the stdout and then stores the rest until the next "data" event, at which point it prints the next 8192 bytes, and so on. That means there's a lag in the output, and when we're running a server process it sometimes stops printing until something on the server triggers another "data" event.

Is there a way to increase the size of this buffer or avoid this behavior? Ideally this commander script just proxies to our other scripts and should print out everything exactly as is.

asked Jan 06 '16 by Ben Wong

1 Answer

You are using child_process.spawn, which is asynchronous: it delivers output incrementally, as and when the child process writes to stdout. Ref: https://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options

I recommend using child_process.exec to run your process; it lets you control the size of the output buffer and delivers the output once the child process has finished. This is how you pass the buffer size:

var exec = require("child_process").exec;

var execute = function(command, callback) {
    // maxBuffer caps the total stdout/stderr size (in bytes) exec will collect
    exec(command, {maxBuffer: 1024 * 500}, function(error, stdout, stderr) {
        callback(error, stdout);
    });
};
answered Sep 22 '22 by Keval Gohil