 

How to create a cache in node.js that explicitly handles simultaneous duplicate requests for a CPU-bound operation

Tags:

node.js

So node is working great for me. I have a very specific server application that basically accepts requests to perform a particular CPU-bound procedure, and it executes a C program to do so. The thing is, if I have multiple clients, it's very likely I'll get multiple copies of the same request at once. It would be a nice optimization to handle that explicitly, by implementing a cache with something like a lock on a particular key, so that other clients simply wait for that request to come back and copy its response.

But I'm new to node, so I don't know how to rig this into my basic node router-request handler mechanism. Obviously I could do it in language x using basic concurrency primitives, but I know that node is event-oriented and I think this could be done quite elegantly in an evented way. Ideas?
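For concreteness, here's a minimal sketch of the kind of handler I have in mind, assuming the C program is run with child_process.execFile — the binary path and the requestToArgs() helper are just placeholders:

var http = require('http');
var execFile = require('child_process').execFile;

http.createServer(function (req, res) {
  // Every request spawns the CPU-bound C program; '/usr/local/bin/crunch' and
  // requestToArgs() are placeholders for the real binary and argument mapping.
  execFile('/usr/local/bin/crunch', requestToArgs(req), function (err, stdout) {
    if (err) { res.writeHead(500); return res.end('error'); }
    res.end(stdout); // identical concurrent requests currently repeat all of this work
  });
}).listen(1337, "127.0.0.1");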

asked Oct 08 '11 by William


1 Answer

There are several answers to this already, but none of them really treat parallel requests to the same resource correctly.

You don't need to worry about concurrency when checking the cache key, since node runs your JavaScript on a single thread: each synchronous block of code runs atomically. However, any async operation yields back to the event loop, so node will accept further requests while the first one is still being processed. You therefore need to handle concurrent overlapping requests, which is solved here by registering observers on an EventEmitter:

var http = require('http'), EventEmitter = require('events').EventEmitter;
var cache = {};

http.createServer(function (req, res) {
   var key = someMagic(req), cached = cache[key]; // get some unique request identifier

   if (!cached) { // if we've never seen this request before
     cached = new EventEmitter(); // make this cache entry an event emitter
     cached.status = 'running';
     cache[key] = cached; // store the entry so overlapping requests can find it
     handleAsyncRequest(function(result) { // your request handling is probably asynchronous; call this callback when you're done
       cached.response = result; // memoize data
       cached.status = 'finished';
       cached.emit('finished'); // notify all observers waiting for this request
     });
     // the first request waits for its own result like any other observer
     cached.once('finished', function() { res.end(cached.response); });

   } else {
     switch(cached.status) { // if existing request, check if it's still running or finished
       case 'finished':
         res.end(cached.response); // send cached response immediately if request has finished
         break;
       case 'running':
         // subscribe as observer; send response when request is finished
         cached.once('finished', function() { res.end(cached.response); });
         break;
     }
   }
}).listen(1337, "127.0.0.1");
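On newer versions of node the same coalescing can be written with Promises by caching the in-flight Promise itself, so every overlapping request awaits the same result. A sketch along those lines, with someMagic and handleAsyncRequest again standing in for your key function and your actual work:

const http = require('http');
const cache = new Map(); // key -> Promise resolving to the result

http.createServer((req, res) => {
  const key = someMagic(req); // same unique request identifier as above

  if (!cache.has(key)) {
    // Store the Promise before it settles, so overlapping requests
    // for the same key reuse it instead of starting duplicate work.
    cache.set(key, new Promise((resolve) => {
      handleAsyncRequest(resolve);
    }));
  }

  cache.get(key).then(
    (result) => res.end(result),
    () => { res.writeHead(500); res.end('error'); }
  );
}).listen(1337, "127.0.0.1");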
answered Oct 11 '22 by zzen