I've been tasked with splitting a process into many smaller processes running in parallel, distributed across a number of slave machines. A request comes in via HTTP, the server splits it up into a number of child requests issued to the slave machines, waits for all the slave requests to return responses, and then collates the aggregate results into a single data object returned as the result of the top-level request.

I thought node.js would be useful here, but since it's single-threaded I can't quite tell whether this is possible, or whether it will block waiting for each request to come back before moving on to the next one. Is this possible with node.js? If so, could someone point me in the right direction, i.e. a node module to use or an outline of how it would be done?
Thanks for any help.
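To make the shape concrete, here is a rough sketch of what I have in mind (the slave URLs are hypothetical placeholders). My worry is whether the requests to the slaves would run concurrently or one after another:

var http = require('http');

// Hypothetical slave endpoints.
var slaves = ['http://slave1:8000/work', 'http://slave2:8000/work'];

http.createServer(function (req, res) {
  var results = [];
  var pending = slaves.length;

  slaves.forEach(function (url, i) {
    // All slave requests are issued immediately; the event loop
    // handles the responses as they arrive, so nothing blocks here.
    http.get(url, function (slaveRes) {
      var body = '';
      slaveRes.on('data', function (chunk) { body += chunk; });
      slaveRes.on('end', function () {
        results[i] = body;            // keep responses in slave order
        if (--pending === 0) {        // last slave just answered
          res.end(JSON.stringify(results));
        }
      });
    });
  });
}).listen(3000);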
You can spawn child processes manually with the child_process module.
var spawn = require('child_process').spawn,
    node  = spawn('node', ['slave.js']);

// Collect whatever the slave writes to stdout and stderr.
node.stdout.on('data', function (data) {
    console.log('stdout: ' + data);
});

node.stderr.on('data', function (data) {
    console.log('stderr: ' + data);
});

node.on('exit', function (code) {
    console.log('child process exited with code ' + code);
});

// Send the slave its job over stdin.
node.stdin.write("do my work!");
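For completeness, the slave.js being spawned could be as simple as the following sketch (hypothetical: it just reads a job from stdin and reports a result back on stdout, which the parent above logs):

// slave.js -- read a job from stdin, write the result to stdout.
process.stdin.setEncoding('utf8');

process.stdin.on('data', function (job) {
  // Pretend to do the work, then report back to the parent process.
  process.stdout.write('finished: ' + job);
});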
We are anticipating a Web Worker API being implemented to handle child processes at a higher level of abstraction.
You want to read about Web Workers. Specifically, you should read http://developer.yahoo.com/blogs/ydn/posts/2010/07/multicore_http_server_with_nodejs/
Take a look at the node.js cluster module, which is a multi-core server manager.
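A minimal sketch along the lines of the cluster documentation: the master forks one worker per CPU core, and the workers all share the same listening port (note this spreads work across cores on one machine, not across slave machines):

var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  // Fork one worker per core.
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', function (worker) {
    console.log('worker ' + worker.process.pid + ' died');
  });
} else {
  // Each worker runs its own copy of the server on the shared port.
  http.createServer(function (req, res) {
    res.end('handled by worker ' + process.pid);
  }).listen(8000);
}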