 

Node.JS Garbage Collection running for an hour?

Tags:

node.js

I had a previous question with a bounty here:

NodeJS application with memory leak, where is it?

It looked as if it was due to the max memory size of the VPS. After increasing the VPS memory to 4 GB, the Node.js process consumes about 3.x GB before GC seems to kick in. It then takes roughly an hour of garbage collection before Node becomes responsive again; at least that's what it looks like in the server monitoring tool:

Free memory reaches 0, then for ~60 minutes a process runs (CPU load shoots up), after which the Node.js application sends out data again.

Is such a long garbage collection process "normal"? Am I missing something?
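One way to see what the GC is actually doing, rather than inferring it from the monitoring graphs, is to start Node with V8's GC tracing enabled. This is a generic suggestion, not something from the original question, and "app.js" is a placeholder for the real entry point:

```shell
# Cap V8's old-generation heap (in MB) and log every GC pass with its pause
# time, so unusually long collections show up immediately in the output.
# "app.js" stands in for the actual entry point of the application.
node --max-old-space-size=2048 --trace-gc app.js
```

With the cap in place, the process gets an out-of-memory abort instead of grinding through enormous heaps, which makes a leak fail fast and visibly.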

Here are some graphs to illustrate it: Graph 1: CPU load (1 min), Graph 2: network traffic in Mbps, Graph 3: CPU utilization

[Server monitoring graphs]

For those who haven't followed the link above, this issue is regarding a Node application that uses Pub/Sub with Redis to receive messages that are then sent out to all connected clients.

I have commented out the "sending to clients" part, and the memory growth slows down drastically, which makes me believe this could be part of the reason. Here is the code of that part:

nUseDelay=1;

....

if(nUseDelay>0) {
    setInterval(function() {
        Object.getOwnPropertyNames(ablv_last_message).forEach(function(val, idx, array) {
            io.sockets.emit('ablv', ablv_last_message[val]);
        });
        ablv_last_message= {};
    }, 15000*nUseDelay);
}

If I comment out:

        // Object.getOwnPropertyNames(ablv_last_message).forEach(function(val, idx, array) {
        //     io.sockets.emit('ablv', ablv_last_message[val]);
        // });

the memory growth becomes very, very slow. Why would this be the reason? Is this a so-called "closure", and if so, how would it ideally be recoded?
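For illustration, the buffer-and-flush pattern from that snippet can be reproduced without socket.io or Redis. This is a minimal sketch, not the original application: `emit` is a stand-in for `io.sockets.emit`, and the message shape is assumed from the code above. Note that resetting the map drops the references to the buffered message strings, which is what lets V8 collect them; as long as something still holds those references (a closure, a pending write buffer), they cannot be freed.

```javascript
// Minimal sketch of the buffer-and-flush pattern used in the question.
var ablv_last_message = {};          // use {} rather than [] -- it is used as a map

// Same logic as the Redis "message" handler: keep only the latest
// message per currency-pair key.
function onRedisMessage(message) {
  var oo = JSON.parse(message);
  ablv_last_message[oo[0].base + "_" + oo[0].alt] = message;
}

// Same logic as the interval callback: forward each buffered message,
// then reset the map so the old strings become collectable.
function flush(emit) {
  Object.keys(ablv_last_message).forEach(function (key) {
    emit('ablv', ablv_last_message[key]);
  });
  ablv_last_message = {};
}

// Usage: two messages for the same key, one flush.
var sent = [];
onRedisMessage(JSON.stringify([{ base: "BTC", alt: "USD", price: 1 }]));
onRedisMessage(JSON.stringify([{ base: "BTC", alt: "USD", price: 2 }]));  // overwrites
flush(function (ev, msg) { sent.push(msg); });
console.log(sent.length); // prints 1 -- only the latest message per key goes out
```

The pattern itself bounds memory to one message per key per interval, which is why the suspicion falls on what happens downstream of `emit` (slow clients causing socket.io to buffer outgoing frames) rather than on the map.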

Here is the full code. It isn't a very complicated piece of work; it looks to me like the standard framework for any case where a Node.js application sends out information from a central application to all its connected clients:

var nVersion="01.05.00";

var nClients=0;
var nUseDelay=1;

var ablv_last_message = [];

// Production
var https = require('https');
var nPort = 6000;               // Port of the Redis Server
var nHost = "123.123.123.123";  // Host that is running the Redis Server
var sPass = "NOT GONNA TELL YA";

var fs = require('fs');
var socketio = require('socket.io');
var redis = require('redis');

//  The server options
var svrPort = 443; // This is the port of service
var svrOptions = {
    key: fs.readFileSync('/etc/ssl/private/key.key'),
    cert: fs.readFileSync('/etc/ssl/private/crt.crt'),
    ca: fs.readFileSync('/etc/ssl/private/cabundle.crt')
};

// Create a Basic server and response
var servidor = https.createServer( svrOptions , function( req , res ){
  res.writeHead(200);
  res.end('Hi!');
});

// Create the Socket.io Server over the HTTPS Server
io = socketio.listen( servidor );

// Now listen in the specified Port
servidor.listen( svrPort );

console.log("Listening for REDIS on " + nHost + ":" + nPort);

io.enable('browser client minification');  // send minified client
io.enable('browser client etag');          // apply etag caching logic based on version number
io.enable('browser client gzip');          // gzip the file
io.set('log level', 1);                    // reduce logging

io.set('transports', [
    'websocket'
  , 'flashsocket'
  , 'htmlfile'
  , 'xhr-polling'
  , 'jsonp-polling'
]);

cli_sub = redis.createClient(nPort,nHost);

if(sPass != "") {
  cli_sub.auth(sPass, function() {console.log("Connected!");});
}

cli_sub.subscribe("vcx_ablv");

console.log("Completed to initialize the server. Listening to messages.");

io.sockets.on('connection', function (socket) {
  nClients++;
  console.log("Number of clients connected " + nClients);
  socket.on('disconnect', function () {
    nClients--;
    console.log("Number of clients remaining " + nClients);
  });
});

cli_sub.on("message",function(channel,message) {
    var oo = JSON.parse(message);
    ablv_last_message[oo[0]["base"]+"_"+oo[0]["alt"]] = message;
});

if(nUseDelay>0) {
    var jj= setInterval(function() {
        Object.getOwnPropertyNames(ablv_last_message).forEach(function(val, idx, array) {
            io.sockets.emit('ablv', ablv_last_message[val]);
        });
        ablv_last_message= {};
    }, 5000*nUseDelay);
}
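Since the suspicion is about heap growth between flushes, one low-tech check (an addition for debugging, not part of the original code) is to log the process's heap numbers on an interval and watch whether heapUsed climbs monotonically (a leak) or sawtooths (normal GC):

```javascript
// Hypothetical debugging helper: formats the current memory numbers,
// rounded to MB, using Node's built-in process.memoryUsage().
function heapSnapshotLine() {
  var m = process.memoryUsage();
  var mb = function (n) { return Math.round(n / 1048576) + 'MB'; };
  return 'rss=' + mb(m.rss) + ' heapUsed=' + mb(m.heapUsed) +
         ' heapTotal=' + mb(m.heapTotal);
}

// Log a snapshot every 15 s; unref() keeps this timer from holding
// the process open on its own.
setInterval(function () { console.log(heapSnapshotLine()); }, 15000).unref();
console.log(heapSnapshotLine());
```

Correlating these lines with the monitoring graphs would show whether the hour-long stall coincides with the heap approaching the 3.x GB ceiling.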

And here is the heapdump analysis after running the application for a couple of minutes:

[Heapdump screenshot]

I thought I'd bump this question, as no satisfying answer has been given yet.

By the way, I put NGINX in front of the Node.js application and all memory issues are gone; the Node application now settles at around 500 MB to 1 GB.
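For reference, a minimal nginx location block for proxying websocket traffic to a Node backend looks roughly like this. The paths and the upstream port are assumptions for illustration; the server in the question listens on 443 directly, so behind nginx it would move to a local port:

```nginx
# Hypothetical reverse-proxy sketch; 8443 is an assumed local port.
location / {
    proxy_pass https://127.0.0.1:8443;
    proxy_http_version 1.1;                   # websockets require HTTP/1.1
    proxy_set_header Upgrade $http_upgrade;   # pass the upgrade handshake through
    proxy_set_header Connection "upgrade";
    proxy_buffering off;                      # stream responses instead of buffering
}
```

A plausible reason this helped: nginx absorbs slow clients and connection churn, so the Node process no longer accumulates large outgoing buffers per socket.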

Asked Dec 02 '13 by KKK



1 Answer

We were recently having the same issue.

Socket.io v0.9.16 automatically opens 5 channels per connection and has a very hard time closing them. We had 18 servers live that were constantly gaining memory until they froze, forcing us to restart them.

By updating to Socket.io v0.9.17, the issue went away.

We spent a week or three looking through every line of code to find the culprit.
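For anyone wanting to rule this out, pinning the fixed release explicitly in package.json is one option. The fragment below is an assumption about the project layout; the version number comes from this answer:

```json
{
  "dependencies": {
    "socket.io": "0.9.17"
  }
}
```

Then `npm ls socket.io` confirms which version is actually installed.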

Answered Sep 19 '22 by Brian Noah