
NodeJS JSON.stringify() bottleneck

My service returns responses consisting of very large JSON objects - around 60 MB. After some profiling I have found that almost all of the time is spent in the JSON.stringify() call used to convert the object to a string before sending it as the response. I have tried custom implementations of stringify, and they are even slower.
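For context, a minimal sketch of the kind of measurement involved (largePayload and res stand in for the real object and response object):

    // time serialization separately from the network write
    console.time('stringify');
    const body = JSON.stringify(largePayload); // ~60 MB object
    console.timeEnd('stringify');              // accounts for most of the ~700 ms

    console.time('send');
    res.setHeader('Content-Type', 'application/json');
    res.end(body);
    console.timeEnd('send');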

This is quite a bottleneck for my service. I want to handle as many requests per second as possible - currently a single request takes 700 ms.

My questions are:
1) Can I optimize the response-sending part? Is there a more efficient way than stringifying the object and sending it as the response?

2) Will using the async module and performing the JSON.stringify() in a separate thread improve the overall number of requests per second (given that over 90% of the time is spent in that call)?

asked Feb 26 '14 by gop



2 Answers

You've got two options:

1) Find a JSON module that will allow you to stream the stringify operation and process it in chunks. I don't know if such a module is out there; if it's not, you'd have to build it. EDIT: Thanks to Reinard Mavronicolas for pointing out JSONStream in the comments. I've actually had it on my back burner to look for something like this, for a different use case. A sketch is shown after this list.

2) async does not use threads. You'd need to use cluster or some other actual threading module to drop the processing into a separate thread. The caveat here is that you're still processing a large amount of data: you gain bandwidth by using threads, but depending on your traffic you may still hit a limit. A sketch of this approach is also shown below.
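A minimal sketch of option 1 using JSONStream, assuming the 60 MB payload is (or can be restructured as) a large array of records; app, records, and the route path are hypothetical Express names:

    const JSONStream = require('JSONStream');

    app.get('/data', (req, res) => {
      res.setHeader('Content-Type', 'application/json');
      // JSONStream.stringify() wraps the records in '[' ... ']' and
      // serializes each one individually, so the response is flushed
      // in chunks instead of being built as one 60 MB string first
      const out = JSONStream.stringify();
      out.pipe(res);
      for (const record of records) {
        out.write(record);
      }
      out.end();
    });

Note that each write still serializes synchronously; the win is that memory stays bounded and the response starts flowing immediately. To keep the event loop responsive as well, you would pace the writes, e.g. with setImmediate between batches.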
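And a sketch of option 2 using the worker_threads module (which postdates this answer; cluster was the contemporary alternative). The caveat above still applies: the object is copied into the worker and the resulting string is copied back, so you pay for the data twice:

    // worker.js -- serializes the object it was spawned with
    const { parentPort, workerData } = require('worker_threads');
    parentPort.postMessage(JSON.stringify(workerData));

    // main.js -- offloads stringify so the event loop stays free
    const { Worker } = require('worker_threads');

    function stringifyInWorker(obj) {
      return new Promise((resolve, reject) => {
        const worker = new Worker('./worker.js', { workerData: obj });
        worker.once('message', resolve);
        worker.once('error', reject);
      });
    }

    // usage inside a request handler:
    //   const body = await stringifyInWorker(largeObject);
    //   res.end(body);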

answered Oct 22 '22 by Jason

Some years later, there is a new answer to the first question: the yieldable-json lib. As described in this talk by Gireesh Punathil (IBM India), this lib can serialize a 60 MB JSON object without blocking the Node.js event loop, letting you accept new requests and thereby improve your throughput.
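A minimal sketch of its callback API (app and largeObject are hypothetical Express names); stringifyAsync yields back to the event loop periodically, so other requests can be served while serialization is in progress:

    const yj = require('yieldable-json');

    app.get('/data', (req, res) => {
      yj.stringifyAsync(largeObject, (err, json) => {
        if (err) return res.sendStatus(500);
        res.type('json').send(json);
      });
    });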

For the second question: with Node.js 11, you can use worker threads (experimental at the time) in order to increase your web server's throughput.

answered Oct 22 '22 by Manuel Spigolon