 

Parsing a large (500 MB) XML file with Node.js

Tags:

node.js

xml

sax

I am using isaacs' SAX parser (sax-js) to parse a huge XML file; it was also recommended by La Gentz.

The process uses about 650 MB of memory before crashing with the error below. How can I reduce its footprint, or allow Node to use more memory?

FATAL ERROR: CALL_AND_RETRY_0 Allocation failed - process out of memory

My XML file is larger than 300 MB, and it could grow to 1 GB.
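As a stopgap for the "allow Node to use more" half of the question, V8's old-generation heap ceiling can be raised with the `--max-old-space-size` flag (value in megabytes). This only buys headroom; streaming the parse is what keeps memory flat for multi-GB inputs. The `-e` snippet below just confirms the flag is accepted; normally you would pass your parser script instead:

```shell
# Raise V8's old-space heap limit to ~4 GB (value is in megabytes).
# This is a stopgap: it delays the crash but does not bound memory use.
node --max-old-space-size=4096 -e 'console.log("ok")'
```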

vik asked Dec 28 '22 08:12


1 Answer

You should stream the file into the parser; that's the whole point of a streaming parser, after all.

var fs = require('fs');
// strict mode enforces well-formed XML; pass parser options as needed
var parser = require('sax').createStream(true, {});
fs.createReadStream(file).pipe(parser);
fent answered Jan 09 '23 13:01