
running many parallel http requests using node.js

I've been tasked with splitting a process into many smaller processes running in parallel, distributed to a number of slave machines. The request comes in via HTTP, the server splits it up into a number of child processes issued to slave machines, waits for all the slave requests to return responses, and then collates the aggregate results into a single data object returned as the result of the top-level request.

I thought node.js would be useful, but since it's single threaded I can't quite tell if this is possible, or if it will block waiting for each request to come back before moving on to the next one.

Is this possible with node.js? If so, could someone point me in the right direction? i.e. a node module to use or an outline of how it would be done?

Thanks for any help.

asked Apr 15 '11 by Jake


3 Answers

You can spawn child processes manually.

var spawn = require('child_process').spawn,
    node  = spawn('node', ['slave.js']); // run the slave script as a child process

node.stdout.on('data', function (data) {
  console.log('stdout: ' + data); // results reported back by the slave
});

node.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

node.on('exit', function (code) {
  console.log('child process exited with code ' + code);
});

node.stdin.write('do my work!\n'); // hand the slave its work unit
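A hypothetical slave.js counterpart for the snippet above (the file name and the echo "work" are assumptions, not part of the original answer) would read work from stdin and report results on stdout, where the parent's 'data' handler picks them up:

```javascript
// slave.js (hypothetical): receive work units on stdin, report on stdout.
function handleWork(work) {
  // Stand-in for the real work: acknowledge and echo the request.
  return 'done: ' + work.trim();
}

process.stdin.setEncoding('utf8');
process.stdin.on('data', function (work) {
  process.stdout.write(handleWork(work) + '\n');
});
```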

We are anticipating a Web Worker API being implemented to handle child processes at a higher level of abstraction.

answered Oct 03 '22 by Raynos


You want to read about Web Workers. Specifically, you should read http://developer.yahoo.com/blogs/ydn/posts/2010/07/multicore_http_server_with_nodejs/

answered Oct 03 '22 by clee


Take a look at the node.js cluster module, which is a multi-core server manager.

answered Oct 03 '22 by yojimbo87