I am working in an embedded environment, where resources are quite limited. We are trying to use node.js, which works well, but typically consumes about 60 megabytes of virtual memory (real memory used is about 5 megabytes.) Given our constraints, this is too much virtual memory; we can only afford to allow node.js to use about 30 megabytes of VM, at most.
There are several command-line options for node.js, such as "--max_old_space_size", "--max_executable_size", and "--max_new_space_size", but after experimentation, I find that these all control real memory usage, not maximum virtual memory size.
If it matters, I am working on an Ubuntu Linux variant on an ARM architecture.
Is there any option or setting that will allow one to set the maximum amount of virtual memory that a node.js process is allowed to use?
The limit can be raised by setting --max_old_space_size to a maximum of ~1024 (~1 GiB) on 32-bit systems and ~1741 (~1.7 GiB) on 64-bit systems, but if you are hitting memory limits it is recommended that you split your single process into several workers.
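As a concrete illustration of the flag above, here is how you would start Node with a reduced old-space cap (`app.js` is a hypothetical entry point; the value is in megabytes and constrains the V8 heap, not the process's total virtual address space):

```shell
# Cap V8's old-generation heap at 30 MB for this process.
# Note: this limits heap usage, not virtual memory reservation.
node --max_old_space_size=30 app.js
```

Newer Node versions also accept the dashed spelling `--max-old-space-size`.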
Tuning the garbage collector: in Node < 12, V8 sets a default limit of 1.5 GB for long-lived objects. If this exceeds the memory available on your system, Node could allow your application to start paging memory to disk.
The memoryUsage() method is a built-in method of the process module that provides information about the memory usage of the current Node.js process. It returns an object describing that usage in bytes.
rss, or resident set size, refers to the amount of space the process occupies in main memory, which includes the code segment, heap, and stack.
You can use softlimit to execute node with a limited address-space size. Alternatively, you can use Linux's setrlimit directly, though I am not sure how to call it from Node.js; see this SO question.
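Without softlimit, the same effect is available from a plain shell via ulimit, which wraps setrlimit (`app.js` is a hypothetical entry point). Be aware that on 64-bit builds V8 reserves a large address-space region up front, so a cap this tight may prevent node from starting at all; on a 32-bit ARM build it is more likely to work:

```shell
# Cap virtual memory (address space) at ~30 MB for the child process.
# ulimit -v takes kilobytes; the subshell keeps the limit from
# affecting the rest of the session.
(ulimit -v 30720; node app.js)
```

If node dies with allocation failures under the limit, raise the value until it starts, then tune downward.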