In Node.js I'm using the exec function of the child_process module to call a Java algorithm that writes a large amount of text to standard out, which I then parse and use. I can capture most of it, but once it exceeds a certain number of lines, the content is cut off.
exec("sh target/bin/solver " + fields.dimx + " " + fields.dimy, function (error, stdout, stderr) {
    // do stuff with stdout
});
I've tried using setTimeout and callbacks without success, but I suspect this is happening because I'm referencing stdout in my code before it has been fully received. I have verified that stdout is in fact where the data loss first occurs; it's not an asynchronous issue further down the line. I've also tested this on my local machine and on Heroku, and the exact same issue occurs, truncating at the exact same line number every time.
Any ideas or suggestions as to what might help with this?
The real (and best) solution to this problem is to use spawn instead of exec. As stated in this article, spawn is better suited to handling large volumes of data:
child_process.exec returns the whole buffer output from the child process. By default the buffer size is set at 200k. If the child process returns anything more than that, your program will crash with the error message "Error: maxBuffer exceeded". You can fix that problem by setting a bigger buffer size in the exec options. But you should not do it, because exec is not meant for processes that return huge buffers to Node. You should use spawn for that. So what do you use exec for? Use it to run programs that return result statuses instead of data.
spawn requires a different syntax than exec:
var spawn = require('child_process').spawn;

var proc = spawn('sh', ['target/bin/solver', String(fields.dimx), String(fields.dimy)]);
proc.on("exit", function(exitCode) {
    console.log('process exited with code ' + exitCode);
});
proc.stdout.on("data", function(chunk) {
    console.log('received chunk ' + chunk);
});
proc.stdout.on("end", function() {
    console.log("finished collecting data chunks from stdout");
});
With @damphat's solution, my exec.stdout.on('end') callbacks hung forever.
Another solution is to increase the buffer size in the options of exec; see the documentation here:
{ encoding: 'utf8',
timeout: 0,
maxBuffer: 200*1024, //increase here
killSignal: 'SIGTERM',
cwd: null,
env: null }
To quote the documentation: maxBuffer specifies the largest amount of data allowed on stdout or stderr; if this value is exceeded, the child process is killed. I now use the following. Unlike the accepted solution, it does not require reassembling stdout from separate chunks (which end up joined with commas there).
exec('dir /b /O-D ^2014*', {
    maxBuffer: 2000 * 1024 // quick fix
}, function (error, stdout, stderr) {
    var list_of_filenames = stdout.split('\r\n'); // adapt to your line-ending characters
    console.log("Found %s files in the replay folder", list_of_filenames.length);
});
Edited: I have tried dir /s on my computer (Windows) and got the same problem (it looks like a bug); this code solved that problem for me:
var exec = require('child_process').exec;

function my_exec(command, callback) {
    var proc = exec(command);
    var list = [];
    proc.stdout.setEncoding('utf8');
    proc.stdout.on('data', function (chunk) {
        list.push(chunk);
    });
    proc.stdout.on('end', function () {
        callback(list.join('')); // join with an empty separator, or the chunks get commas between them
    });
}

my_exec('dir /s', function (stdout) {
    console.log(stdout);
});