NodeJs performance problem

I'm building a realtime stats application using Node.js. For the prototype I'm testing on a quad-core AMD Opteron RackSpace server, running a Node.js server with the Cluster module ( http://learnboost.github.com/cluster/ ) and MongoDB through the native Node.js driver.

Basically, I've inserted a JS snippet into my company's project that delivers content to a bunch of clients' websites. This code "pings" my server every 10 seconds by requesting an image and passing parameters that I read on the server side and insert (or update) into a MongoDB collection. At a "slow" time of day I see about 3000 connections at a time (counted with the netstat -natp command in the terminal), which makes my cluster use about 25% of each core (measured with the top command). But at a "busy" hour I get 7000+ connections at a time, which makes my cluster go crazy (80%+ of each core), and it seems that, as time goes by, the node process degrades. Is this normal? Or should Node.js handle these hits more easily? If I use Mongoose, could the performance increase?
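For context, a minimal sketch of roughly what the client-side ping does (the hostname and client id are placeholders; the path shape matches the route in the server code below, with the timestamp acting as a cache-buster):

// Hypothetical client-side ping: request a transparent GIF every 10 seconds,
// encoding the client id, the event, and a cache-busting value in the URL.
setInterval(function () {
    var img = new Image();
    img.src = 'http://stats.example.com/pingserver/12345/ping/' + new Date().getTime();
}, 10000);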

In case you are curious about MongoDB: it uses about 4% of one core, which is fine by me (without an index the usage was 50%+, but at least the index solved that performance problem).
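For reference, a sketch of how that index could be created with the native driver, assuming (as the update queries below suggest) that the lookups filter on the id field:

// Assumed: the update queries filter on "id", so index that field.
db.collection('clientsessions', function (err, collection) {
    collection.ensureIndex({ id: 1 }, function (err, indexName) {
        if (err) console.log(err);
    });
});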

Thanks a lot for your patience. Cheers.

Edit:

The code that makes the inserts looks like this. The connection is opened beforehand with:

db.open(function(err, db) { });

and the route is set up like this (connect, url, fs, and the cached transparent GIF are defined elsewhere in the module):

return connect.router(function(app){
    app.get("/pingserver/:clientid/:event/:cachecontrol", function(req, res, next){
    event:'+req.params.event + ', cachecontrol:' + req.params.cachecontrol);
        var timestamp = new Date(); 
          switch(req.params.event) {
          case 'load':
              var params = url.parse(req.url, true).query;

              db.collection('clientsessions', function(err, collection) {
                try {

                    var client = {
                        id: req.params.clientid,
                        state: req.params.event + 'ed',
                        loadTime: timestamp.getTime(),
                        lastEvent: req.params.event,
                        lastEventTime: timestamp.getTime(),
                        lastEventDate: timestamp.toString(),
                        events: [{
                            event: req.params.event,
                            timestamp: timestamp.getTime(),
                            date: timestamp.toString()
                        }],
                        media: {
                            id: params.media.split('|')[0] || null,
                            title: unescape(params.media.split('|')[1]) || null
                        },
                        project: {
                            id: params.project.split('|')[0] || null,
                            name: unescape(params.project.split('|')[1]) || null
                        },
                        origin: req.headers['referer'] || req.headers['referrer'] || '',
                        userAgent: req.headers['user-agent'] || null,
                        userIp: req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress)),
                        returningUser: false
                    };
                } catch (e) { console.log(e); }
                collection.insert(client, function(err, doc) {
                });
              });
              break;

          case 'ping':
              db.collection('clientsessions', function(err, collection) {
                  collection.update({ id: req.params.clientid }, {
                      $set: { lastEvent: req.params.event,
                              lastEventTime: timestamp.getTime(),
                              lastEventDate: timestamp.toString() }
                  }, {}, function(err, doc) {});
              });
              break;

          default:
              db.collection('clientsessions', function(err, collection) {
                  collection.update({ id: req.params.clientid }, {
                      $set: { state: req.params.event + 'ed',
                              lastEvent: req.params.event,
                              lastEventTime: timestamp.getTime() },
                      $push: { events: { event: req.params.event, timestamp: timestamp.getTime(), date: timestamp.toString() } }
                  }, {}, function(err, doc) {});
              });

              break;
          }

          // read the transparent 1x1 GIF once and cache it for subsequent requests
          if (!transparent) {
              console.log('!transparent');
              transparent = fs.readFileSync(__dirname + '/../../public/images/transparent.gif', 'binary');
          }
          res.setHeader('Content-Type', 'image/gif');
          res.setHeader('Content-Length', transparent.length);

          res.end(transparent, 'binary');
      });
});
asked May 06 '11 by Thiago Miranda de Oliveira


1 Answer

Is this normal?

It depends. Are the connections going away on their own, or do they just keep building up? Are you talking about web connections (HTTP) or MongoDB connections?

What do the mongod logs say? What do the node logs say?

How many requests are you getting per second?
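If you don't have that number handy, one rough way to get it is to count hits inside the app itself; a minimal sketch (the counter name and one-second logging interval are made up for illustration):

// Hypothetical requests-per-second counter: bump it in the route handler,
// then log and reset it once per second.
var requestCount = 0;
setInterval(function () {
    console.log(new Date().toString() + ' - requests/sec: ' + requestCount);
    requestCount = 0;
}, 1000);

// inside the app.get handler, add:
// requestCount++;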

Or should Node.js handle these hits more easily?

Hard to say without knowing what the code is doing.

How many simultaneous connections do you expect the box to handle?

If I use Mongoose, could the performance increase?

So Mongoose is actually an object wrapper around the node-mongodb-native driver. It is not a different driver; it's just a wrapper.

The wrapper is going to add code on top of the code you already have. If you have a code problem, then adding more code is not guaranteed to make the problem better. If Mongoose does solve your problem, then it's doing something with connections that you're not. If that's the case, you don't necessarily need Mongoose, you just need better connection management.
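For example, with the native driver you can open one Db handle with a server-side connection pool at startup and reuse it for every request, rather than opening connections per hit. A rough sketch using the old driver API that the question's code appears to use (host, port, database name, and pool size are assumptions):

var mongo = require('mongodb');

// Assumed host/port/db name; open the connection once at startup,
// with a small pool, and reuse the same handle in every request handler.
var server = new mongo.Server('localhost', 27017, { auto_reconnect: true, poolSize: 4 });
var db = new mongo.Db('stats', server);

db.open(function (err, db) {
    if (err) throw err;
    // hand `db` to the router and use it for all inserts/updates
});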


Look, there are lots of potential sources for your issue.

The only way to get this solved is to break out the pieces and dig in with much more detail. Places to start:

- Are connections to MongoDB closing correctly (look at the db logs)?
- Do the logs contain any other errors?
- Do the same thing for the node logs.
- Do you have graphs of memory usage? Who's taking up the most memory? (See the sketch below for one cheap way to collect this.)
- When you get to 80% of each core, which process is doing it? mongod? node? Something else?
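On the memory question, Node can report its own usage via process.memoryUsage(), so you can get rough numbers without any external tooling; a minimal sketch (the 30-second interval is arbitrary):

// Log this node process's memory usage every 30 seconds.
setInterval(function () {
    var mem = process.memoryUsage();
    console.log('rss: ' + mem.rss + ', heapTotal: ' + mem.heapTotal + ', heapUsed: ' + mem.heapUsed);
}, 30000);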

To really help you out here, we need a lot more data about what's going on with the system.

answered Sep 26 '22 by Gates VP