So node is working great for me. I have a very specific server application that basically accepts requests to perform a particular CPU-bound procedure, and it executes a C program to do so. Thing is, if I have multiple clients, it's very likely I'll get multiple copies of the same request. It would be a nice optimization to handle that explicitly: implement a cache with something of a lock on a particular key, so that other clients simply wait on that request coming back and copy its response.
But I'm new to node, so I don't know how to rig this into my basic node router-request handler mechanism. Obviously I could do it in language x using basic concurrency primitives, but I know that node is event-oriented and I think this could be done quite elegantly in an evented way. Ideas?
Several answers above, but none of them really treats parallel requests to the same resource correctly.
You don't need to worry about concurrency when checking for a cache key, since node is a single-threaded environment: your synchronous code always runs to completion without interruption, so the check-then-insert on the cache is effectively atomic. Any async operation, however, lets node accept further requests before the first one finishes, so you do need to handle concurrent overlapping requests. Here that's solved by registering observers on an EventEmitter:
var http = require('http'),
    EventEmitter = require('events').EventEmitter;

var cache = {};

http.createServer(function (req, res) {
  var key = someMagic(req),  // derive some unique request identifier
      cached = cache[key];
  if (!cached) { // if we've never seen this request before
    cached = cache[key] = new EventEmitter(); // store the entry; it doubles as an event emitter
    cached.status = 'running';
    // the first client is an observer too, so it also gets a response
    cached.once('finished', function() { res.end(cached.response); });
    handleAsyncRequest(function(result) { // your request handling is probably asynchronous; call this callback when you're done
      cached.response = result;  // memoize data
      cached.status = 'finished';
      cached.emit('finished');   // notify all observers waiting for this request
    });
  } else {
    switch (cached.status) { // existing request: check if it's still running or finished
      case 'finished':
        res.end(cached.response); // send the cached response immediately
        break;
      case 'running':
        // subscribe as an observer; the response is sent when the request finishes
        cached.once('finished', function() { res.end(cached.response); });
        break;
    }
  }
}).listen(1337, "127.0.0.1");
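As an aside, if you're on a Node version with promises, the same de-duplication falls out naturally by caching one promise per key instead of an event emitter: the first request creates the promise, every overlapping request just chains onto it. A minimal sketch, where runProcedure(), './my-c-program', and the use of req.url as the key are hypothetical placeholders for your own binary and your someMagic(req):

var http = require('http'),
    execFile = require('child_process').execFile;

var inflight = {}; // key -> promise of the eventual result

// Hypothetical wrapper around the CPU-bound C program; swap in your real binary.
function runProcedure(key) {
  return new Promise(function (resolve, reject) {
    execFile('./my-c-program', [key], function (err, stdout) {
      if (err) { reject(err); } else { resolve(stdout); }
    });
  });
}

http.createServer(function (req, res) {
  var key = req.url; // stand-in for someMagic(req)
  if (!inflight[key]) {
    // first request for this key starts the work; later requests reuse the same promise
    inflight[key] = runProcedure(key);
  }
  inflight[key].then(function (result) {
    res.end(result);
  }, function (err) {
    delete inflight[key]; // drop failed entries so a later request can retry
    res.statusCode = 500;
    res.end('procedure failed');
  });
}).listen(1337, "127.0.0.1");

Note that neither version ever evicts finished entries, so in a real server you'd want some eviction policy (a TTL or an LRU cap) to keep the cache from growing without bound.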