How to store 100GB in memory in Node.js

Say you have a machine that has 200GB+ RAM, is it possible to store that much data inside an application?

I'm running a test that creates a nested object like the following:

{
  id1: {
    val1: 123,
    val2: 456,
    v...
  },
  id2: {
    val1: 234,
    val2: 567,
    ...
  }
  ...
}

I run it with --max-old-space-size=200000 and it runs fine until the object is about 50GB in size, then crashes every time with: FATAL ERROR: NewSpace::Rebalance Allocation failed - process out of memory.

I've tried manually forcing garbage collection, separating it into smaller objects, etc., with no luck.

asked Jul 20 '18 by lifwanian

People also ask

How do I increase node JS memory limit?

If you want to increase the max memory for Node you can use the --max-old-space-size option (the value is in megabytes). You can set it with the NODE_OPTIONS environment variable or pass it directly on the command line.
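For example, either of the forms below raises the old-space limit to 8 GB (the size and the app.js entry script are placeholders; pick values that fit your setup):

```shell
# Raise V8's old-space limit for every node process started from this shell.
# The value is in megabytes, so 8192 = 8 GB.
export NODE_OPTIONS="--max-old-space-size=8192"

# Or pass the flag directly for a single run:
# node --max-old-space-size=8192 app.js
```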

How much memory can Nodejs use?

In Node < 12, V8 sets a default limit of about 1.5 GB for long-lived objects. If the heap exceeds the memory available to your process, the operating system may start paging memory to disk.

How much memory should a Node app use?

Theoretically, a 64-bit process should be able to allocate more than 4GB and grow comfortably well into 16 terabytes of address space.


2 Answers

You don't. Use streams. The following is, I feel, a pretty darn good write-up of what streams are and how to use them; I refer to it from time to time: https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93

This will take your memory footprint from "200GB" for a single object down to likely just a few MB of memory usage.

answered Sep 17 '22 by dvsoukup


Node doesn't do well with very large heaps. I'm surprised you're getting all the way to 50GB; I usually see crashes around 4-8GB when trying to load too much.


If you're working on a pipeline[1], use line-separated JSON when reading from or writing to a file (\n marks the end of each line):

{"id":"id01","value":{...}}\n
{"id":"id02","value":{...}}\n

You can then read and write this as a stream, using readily available line-by-line filters in your pipeline.


Alternatively, if you need to query this data interactively, it's probably best to use a local database like Redis.

[1] https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93

answered Sep 19 '22 by Tracker1