 

How to "free" memory Buffer from Garbage Collector in Node.js?

I want to know how to destroy a Buffer in order to free its memory.
I have the code below; it creates a Buffer and sends it as the response. This works fine, but when I use a big array of about 75,000 rows, my process memory climbs above 1 GB. That would be acceptable, except that once the response has been sent, the memory is kept and never freed. I tried setting var buffer to null at the end of the script, but nothing happened. Is there a way to free this memory?

var xlsxexport = require('node-xlsx');

module.exports = {
    exportExcel: function (req, res) {

        var excelData = [];
        // ...
        // Construction of the array excelData
        // ...

        var filename = 'export.xlsx'; // example filename
        var buffer = xlsxexport.build([{ name: 'export', data: excelData }]);
        res.set('Content-Type', 'application/vnd.openxmlformats');
        res.set('Content-Disposition', 'attachment; filename=' + filename);
        res.send(buffer);
    }
};
Zagonine asked Dec 07 '16

People also ask

How do I stop memory leaks in Node js?

Avoid accidental globals. A typo can turn an intended local variable into a global one and lead to a memory leak. Another way this happens is when assigning a variable to this within a function called in the global scope. To avoid issues like this, always write JavaScript in strict mode by putting the 'use strict'; annotation at the top of your JS file.

How do I increase the memory limit in Node js?

If you want to increase the maximum memory available to Node, you can use the --max_old_space_size option. You can set it with the NODE_OPTIONS environment variable.

What is buffer Alloc in Node js?

The Buffer.alloc() method creates a new, zero-filled buffer object of the specified size.
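A quick illustration:

```javascript
// Buffer.alloc(size) returns a new Buffer of `size` bytes, zero-filled.
const buf = Buffer.alloc(4);
console.log(buf.length); // 4
console.log(buf[0]);     // 0 — alloc() zero-fills, unlike Buffer.allocUnsafe()
```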


2 Answers

If you are really confident that your code has no leaks, and you have already set every big variable to null, then you can try to trigger the GC manually.

Add the --expose-gc option when running your script:

node --expose-gc index.js

Then, wherever you want inside your script, you can call the GC:

global.gc()

But I strongly recommend finding a way to do this without forcing the GC.

Good luck!

JerryCauser answered Oct 08 '22


First, the module you are using seems to add a lot of overhead for 75k rows; wouldn't it be better to use the CSV format instead?

If you do want to continue with your approach, there are some V8 options that can restrict the memory limits. Restricting memory can cause more time to be spent on garbage collection, so be careful not to over-optimise.

Here is a good place to start: Limit Node.js memory usage to less than 300MB per process

Risto Novik answered Oct 08 '22