
Is it normal for Node.js' RSS (Resident Set Size) to grow with each request, until reaching some cap?

Tags: node.js, v8

I've noticed that the RSS (Resident Set Size) of my Node.js app is growing over time, and considering I'm getting a "JS Object Allocation Failed - Out of Memory" error on my server, this seems a likely cause.

I set up the following very simple Node app:

var express = require('express');

var app = express();
// Report the process's current memory stats (rss, heapTotal, heapUsed) as JSON.
app.get('/', function (req, res, next) {
    res.end(JSON.stringify(process.memoryUsage()));
});
app.listen(8888);

By simply holding down the "refresh" hotkey at http://localhost:8888/ I can watch the RSS/heap/etc. grow until the RSS gets well above 50 MB (before I get bored). Waiting a few minutes and coming back, the RSS drops - presumably the GC has run.
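For reference, here is a minimal load generator (not part of the original question, just a sketch assuming the app above is listening on port 8888) that hits the endpoint in a loop so the reported numbers can be watched without holding the refresh key:

// Minimal load generator for the Express app above.
// Assumes it is listening on localhost:8888.
var http = require('http');

var remaining = 1000;

function hit() {
    if (remaining-- <= 0) return;
    http.get('http://localhost:8888/', function (res) {
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            // Each response body is the server's process.memoryUsage() snapshot.
            console.log(body);
            hit();
        });
    }).on('error', function (err) {
        console.error(err.message);
    });
}

hit();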

I'm trying to figure out if this explains why my actual Node app is crashing. My production app quickly hits about 100 MB of RSS, and when it crashes it is generally between 200 MB and 300 MB. As best I can tell, this should not be too big (Node should be able to handle around 1.7 GB, I believe), but nonetheless I'm concerned that the RSS size on my production server trends upwards (the falloffs represent crashes):

[Graph: RSS of the production server over time, trending upward; the drops correspond to crashes]
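Regarding the "1.7 GB or so" figure above: on modern Node releases you can check the actual V8 heap limit for your process with the built-in v8 module. This is a sketch added for illustration; getHeapStatistics() is not available on the 0.x versions that were current in 2012.

// Print the V8 heap limit for the current process.
var v8 = require('v8');

var stats = v8.getHeapStatistics();
console.log('heap_size_limit:',
    (stats.heap_size_limit / 1024 / 1024).toFixed(1) + ' MB');

// The limit can be raised when starting the process, for example:
//   node --max-old-space-size=4096 app.js

Note that RSS can legitimately exceed the V8 heap limit, since it also includes code, stacks, and buffers allocated outside the JS heap.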

Asked Dec 02 '12 by Zane Claes

People also ask

What is RSS memory in Node.js?

rss, or resident set size, refers to the amount of space occupied in main memory by the process, which includes the code segment, heap, and stack.
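As a quick illustration (added here, not part of the original answer), those fields can be printed directly:

// Print the fields returned by process.memoryUsage().
// rss covers the whole process (code, stack, heap, off-heap allocations);
// heapTotal/heapUsed cover only V8's managed heap.
var usage = process.memoryUsage();
console.log('rss:      ', Math.round(usage.rss / 1024 / 1024) + ' MB');
console.log('heapTotal:', Math.round(usage.heapTotal / 1024 / 1024) + ' MB');
console.log('heapUsed: ', Math.round(usage.heapUsed / 1024 / 1024) + ' MB');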

How much space does Node.js take?

4 KB of 24-bit storage is required for each thread used by the Node.js runtime. The number of threads used is fixed once the Node.js runtime has started, and is typically between 8 and 12, unless you set the UV_THREADPOOL_SIZE environment variable.
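For context, the kind of work that runs on the libuv thread pool (the part UV_THREADPOOL_SIZE controls) includes file system calls and CPU-heavy crypto. A small sketch, added for illustration - with the default pool of 4 threads, only four of these hashing calls run at once and the rest queue behind them:

// Run with e.g.:  UV_THREADPOOL_SIZE=8 node pool-demo.js   (file name is just an example)
var crypto = require('crypto');

for (var i = 0; i < 8; i++) {
    (function (id) {
        var start = Date.now();
        // pbkdf2 is executed on a libuv thread pool thread, not on the JS thread.
        crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', function () {
            console.log('task ' + id + ' finished after ' + (Date.now() - start) + ' ms');
        });
    })(i);
}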

In which areas is Node.js not recommended?

When Node.js receives a CPU-bound task: whenever a heavy request comes to the event loop, Node.js would set all the CPU available to process it first, and then answer other requests queued. That results in slow processing and overall delay in the event loop, which is why Node.js is not recommended for heavy computation.
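A minimal sketch of that effect (added for illustration; port 8889 and the loop size are arbitrary): while the synchronous computation below runs, the same process cannot answer any other request.

// While heavyComputation() runs, the single JS thread is busy and
// every other request to this server waits.
var http = require('http');

function heavyComputation() {
    var sum = 0;
    for (var i = 0; i < 1e9; i++) sum += i; // blocks the event loop
    return sum;
}

http.createServer(function (req, res) {
    if (req.url === '/heavy') {
        res.end('result: ' + heavyComputation());
    } else {
        res.end('fast response'); // delayed while /heavy is being computed
    }
}).listen(8889);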


1 Answer

This question is quite old already and yet has no answer, so I'll throw in mine, which references a blog post from 2013-2014 by Jay Conrod, who has "worked on optimizing the V8 JavaScript engine for mobile phones".

V8 tries to be efficient when collecting garbage, and for that it uses incremental marking and lazy sweeping.

Basically, incremental marking is responsible for tracking which of your objects can be collected.

Incremental marking begins when the heap reaches a certain threshold size.

Lazy sweeping is responsible for collecting the objects marked as garbage during incremental marking and for performing other time-consuming tasks.

Once incremental marking is complete, lazy sweeping begins. All objects have been marked live or dead, and the heap knows exactly how much memory could be freed by sweeping. All this memory doesn't necessarily have to be freed up right away though, and delaying the sweeping won't really hurt anything. So rather than sweeping all pages at the same time, the garbage collector sweeps pages on an as-needed basis until all pages have been swept. At that point, the garbage collection cycle is complete, and incremental marking is free to start again.

I think this explains why your server allocates so much memory until it reaches a certain cap. For a better understanding I recommend reading Jay Conrod's blog post "A tour of V8: Garbage Collection".
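If you want to watch this behaviour directly, a small sketch (the file name is just an example) is to allocate garbage in a loop and run Node with V8's GC tracing enabled:

// Run with:  node --trace-gc gc-demo.js
// V8 then prints a line for each scavenge / mark-sweep step, including how
// much memory was reclaimed, so you can see sweeping lag behind allocation.
var garbage = [];

setInterval(function () {
    // Allocate a batch of short-lived objects to push the heap toward the
    // threshold at which incremental marking starts.
    for (var i = 0; i < 10000; i++) {
        garbage.push({ value: Math.random() });
    }
    if (garbage.length > 200000) garbage = []; // let it all become collectable
    console.log('heapUsed:',
        Math.round(process.memoryUsage().heapUsed / 1024 / 1024) + ' MB');
}, 100);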

Answered Sep 30 '22 by borisdiakur