 

Number of objects vs payload: which is more important when scaling a modern JavaScript project?

Of course, a smaller payload usually means fewer objects, but please read the whole description below.

While scaling a JavaScript project in a modern browser, which is more important: the size of the data payload, or the number of JavaScript objects in memory? I have a huge JSON string which I loop over, chopping it into separate objects. The JSON string holds a lot of traveller information, and each JavaScript object has a lot of properties. When there are more than 10,000 travellers in the JSON, the browser struggles to perform.

I am bringing in a lot of unnecessary properties. If I reduce the number of properties, my payload will obviously decrease; still, the number of objects might stay the same.
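For example (property names made up), trimming each traveller down to just the fields the UI actually uses would look something like this; the payload per object shrinks, but the object count stays the same:

// a sketch: same number of objects, far fewer properties each.
var trimmed = travellers.map(function (t) {
    return { id: t.id, firstName: t.firstName, lastName: t.lastName };
});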

Fewer JS objects vs a smaller payload: which one gives more bang for the buck in terms of performance?

Thanks

asked Aug 22 '17 by Naren


1 Answer

I like some of the answers I've read, but I would approach the question of the bottleneck in a different way, one that I hope will help you avoid future bottlenecks as well.

Most answers assume that the number of objects is the bottleneck. I don't think that's the case. I believe the bottleneck is that the JavaScript event loop gets excessively backlogged.

As you probably know, JavaScript runs on a single thread, driven by an event loop.

Each function you call is effectively a callback for one of those events.

However, since there is only a single thread, any code in the web page has to wait for each event to complete before any other task can be performed.

This means that it's more important for JavaScript functions to be fragmented (into micro events/callbacks) than for any single function to be performance-oriented.
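As a rough sketch of that pattern (processInChunks, travellers, and renderTraveller are hypothetical names, not part of your code):

// A minimal sketch: process a large array in small batches,
// yielding back to the event loop between batches so the
// browser can still handle rendering and input in between.
function processInChunks(items, handleItem, chunkSize) {
    var i = 0;
    function step() {
        var end = Math.min(i + chunkSize, items.length);
        for (; i < end; i++)
            handleItem(items[i]);
        if (i < items.length)
            setTimeout(step, 0); // return control to the event loop
    }
    step();
}

// e.g., processInChunks(travellers, renderTraveller, 100);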

In your case, you're both looping over a long string and performing actions - without returning control to the event loop - which means the browser has to wait for this big chunk of code to complete before it can process any more data / events.

The question of data collection / processing can be argued over. Is it better to get a lot of small messages (placing, perhaps, more load on the network / server)? Is it better to receive a huge string and process it by chunks? ...

... I don't know; this really depends on other factors, such as the server's design, the database's design, the client load, the update interval, etc.
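If you can stream the data, one modern option (assuming browser support for fetch() response streaming; "/travellers" is a placeholder endpoint) is to hand each chunk onward as it arrives, e.g. to a chunked parser like the one below:

// a rough sketch, not a drop-in solution.
fetch("/travellers").then(function (response) {
    var reader = response.body.getReader();
    var decoder = new TextDecoder();
    function pump() {
        return reader.read().then(function (result) {
            if (result.done)
                return;
            // decode this chunk of bytes and pass it on for parsing.
            var chunk = decoder.decode(result.value, { stream: true });
            console.log("received", chunk.length, "characters");
            return pump();
        });
    }
    return pump();
});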

If you do prefer to process a single huge string, it might be better to process it a bit at a time and then forward it to a callback for future processing.

e.g., for a \n-separated JSON string, you could try:

function consumeString(s) {
    if (s.length === 0)
        return;
    // find the next newline; if there is none, consume the rest.
    var sep = s.indexOf("\n");
    if (sep < 0)
        sep = s.length;
    try {
        var obj = JSON.parse(s.slice(0, sep));
        console.log("processed:", obj);
    } catch (e) {
        console.log("failed... not valid JSON?", e);
    }
    // schedule the next slice, returning control to the event loop.
    setTimeout(consumeString, 0, s.slice(sep + 1));
}


var text = '{ "employees" : [' + // JSON1
'{ "firstName":"John 1" , "lastName":"Doe 1" },' +
'{ "firstName":"Anna 1" , "lastName":"Smith 1" },' +
'{ "firstName":"Peter 1" , "lastName":"Jones 1" } ]}' + // END JSON1
"\n" +
'{ "employees" : [' + // JSON2
'{ "firstName":"John 2" , "lastName":"Doe 2" },' +
'{ "firstName":"Anna 2" , "lastName":"Smith 2" },' +
'{ "firstName":"Peter 2" , "lastName":"Jones 2" } ]}';

consumeString(text);

This is just an outline, obviously. Although it seems less performant (it wastes time rescheduling itself and is constantly interrupted, increasing the chance of CPU cache misses), it actually helps the browser remain responsive and improves perceived performance from the user's point of view.

answered Oct 12 '22 by Myst