Reduce node.js application memory usage with streams?

This is probably a newbie question but I searched and couldn't find a satisfying answer.

My node.js application seems to consume a lot of memory. Each process consumes about 100MB. I heard that nodejs itself has a ~30MB memory footprint per process.

The application is a JSON API, backed by MongoDB. In many cases, one API request results in many database requests, mainly to populate child relationships. A typical query goes like this: (1) get an array of objectIds based on a query condition, and (2) iterate over each objectId and issue a query to the database to populate the data (some call that hydration).
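Stripped down, the pattern looks roughly like this (the collection name is made up for illustration):

```js
var async = require('async');

// Simplified two-step pattern:
// (1) fetch the matching objectIds, (2) hydrate each one with its own query.
function loadHydrated(db, condition, done) {
  db.collection('items').find(condition).toArray(function (err, docs) {
    if (err) return done(err);

    // one extra round-trip per objectId to populate the child relationships
    async.map(docs, function (doc, cb) {
      db.collection('items').findOne({ _id: doc._id }, cb);
    }, done);
  });
}
```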

The code uses async.js heavily. I tried to profile the memory usage, and async.js seems to use a lot of memory, but there is no sign of a memory leak. The author of async.js has also released a stream library, highland.js (http://highlandjs.org/). I am new to node.js streams, and I am curious whether this is a possible tool to replace async.js? The website mostly mentions underscore, but I mainly use async.js for asynchronous processing.
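From what I can tell from the highland docs, the same hydration might look roughly like this (untested sketch, same made-up collection name, and it assumes the MongoDB driver's cursor .stream() method):

```js
var _ = require('highland');

function loadHydrated(db, condition, done) {
  // wrapCallback turns a node-style async function into one that returns a stream
  var findOne = _.wrapCallback(function (id, cb) {
    db.collection('items').findOne({ _id: id }, cb);
  });

  _(db.collection('items').find(condition).stream()) // wrap the cursor stream
    .map(function (doc) { return doc._id; })
    .flatMap(findOne)       // hydrate one document at a time instead of all at once
    .toArray(function (items) { done(null, items); }); // error handling omitted
}
```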

Thanks!

asked Mar 28 '14 by yichen
1 Answer

The tldr: yes, using streams can potentially reduce your memory footprint. When you process a stream, you process chunks of data at a time. The alternative is basically to load all the data beforehand into a String, Buffer, whatever, and then to process it.
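To make that concrete for a JSON endpoint backed by a MongoDB cursor, here is a rough sketch of the two approaches; it assumes an http/Express-style `res` and the JSONStream module, which is not part of node core:

```js
// Buffered: every document is held in memory before a single byte is sent
collection.find(query).toArray(function (err, docs) {
  if (err) return res.end(JSON.stringify({ error: 'query failed' }));
  res.end(JSON.stringify(docs));
});

// Streamed: documents flow to the client in chunks, so memory use stays roughly flat
var JSONStream = require('JSONStream');

collection.find(query).stream()
  .pipe(JSONStream.stringify())   // serializes a JSON array piece by piece
  .pipe(res);
```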

However, you should note that 100MB is not large for a node process. Node/v8 assumes that you'll have about 1.5 GB to work with. A tiny app might be 128 MB, a small one 256, medium 512. 1 GB is a pretty large node process, and at that point you should probably split your app into smaller pieces:

  • https://github.com/joyent/node/wiki/FAQ#what-is-the-memory-limit-on-a-node-process
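If you want to see where your own process sits relative to those numbers, the built-in process.memoryUsage() reports the resident set size and heap figures:

```js
// Values are reported in bytes; rss is the total resident set size of the process
var mem = process.memoryUsage();
console.log('rss:       %d MB', Math.round(mem.rss / 1024 / 1024));
console.log('heapTotal: %d MB', Math.round(mem.heapTotal / 1024 / 1024));
console.log('heapUsed:  %d MB', Math.round(mem.heapUsed / 1024 / 1024));

// If you really need more headroom, the v8 heap limit can be raised, e.g.:
// node --max-old-space-size=2048 app.js
```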
answered Oct 30 '22 by hunterloftis