
DOM overload problems in Google Chrome

I have some problems when using a large number of hidden items in Google Chrome.

Recently I posted a question that many considered unclear, and it was soon closed. I have since found the cause of the problem, but so far I have no idea how to solve it.

When developing pages, a common technique is to create some elements in advance, hide them, and display them when needed.

The number of such elements greatly affects how responsive the browser is. Suppose we have the following code:

var elem = document.getElementsByClassName('Founder')[0];
var parent = document.getElementsByClassName('Cloud')[0];
var empty = document.getElementsByClassName('empty')[0];
for (var i = 0; i < 50000; i++) {
    var clone = elem.cloneNode(true);
    // var clone = empty.cloneNode(true);
    clone.style.display = 'none';
    parent.appendChild(clone);
}
<div class='Cloud'>
    <input class='Founder' type='text'>
    <div class='empty'></div>
</div>

When I run this in Firefox 67.0 (64-bit), there is no noticeable slowdown. But when I run it in Chrome 74.0.3729.169 (Official Build, 64-bit), it slows down severely.

In the performance profile this shows up as Empty Task (System); see the screenshot. (It is from an older version of the page, with a total of 640,000 nodes, but that does not change the essence.)

Performance tools showing slow task

Is there a way to speed this up, or to somehow freeze the elements that are not displayed? As far as I understand, these empty tasks represent the time the browser spends indexing the elements, or something like that.

Perhaps there are settings that can be changed programmatically to speed things up (even at the cost of more RAM).

asked Jun 06 '19 by Sergei Illarionov

1 Answer

parent.appendChild(...); is the slow step. You're adding 50,000 nodes to the document; even hidden ones (which should avoid a layout/reflow step) are a significant amount of work to add.

Adding them to a DocumentFragment first will help, but not by much if you want to quickly render 640,000 nodes.
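As a minimal sketch of the DocumentFragment idea (the buildHiddenClones helper and its doc parameter are my own invention, added so the loop can be exercised outside a browser; 'Founder' and 'Cloud' are the class names from the question):

```javascript
// Hypothetical helper: clone `template` `count` times into a detached
// DocumentFragment, hiding each clone, so the live document is touched
// only once at the end. `doc` is passed in instead of using the global
// `document` so the loop can also run against a stub.
function buildHiddenClones(doc, template, count) {
  var fragment = doc.createDocumentFragment();
  for (var i = 0; i < count; i++) {
    var clone = template.cloneNode(true);
    clone.style.display = 'none';
    fragment.appendChild(clone); // mutates only the detached fragment
  }
  return fragment;
}

// Browser usage, with the markup from the question:
// var elem = document.getElementsByClassName('Founder')[0];
// var parent = document.getElementsByClassName('Cloud')[0];
// parent.appendChild(buildHiddenClones(document, elem, 50000)); // one live insertion
```

The point is that every appendChild inside the loop mutates a detached fragment, so the browser does the expensive document update once rather than 50,000 times.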

Creating elements in advance and then showing them can be a useful way to avoid jank during user interactions, but large documents are generally slower, and large numbers of nodes are hard to make fast.

I think you have two options:

  1. Batch the changes into smaller pieces of work, and use requestAnimationFrame to wait for the next frame before doing the next batch. That will prevent the jank and the second or so of apparent freeze while the browser processes all the new DOM, but it will take much longer overall. Alternatively, you could use requestIdleCallback to build up the massive document in the background when no other work is happening. With either approach you can hit a state where the document appears interactive but your hidden DOM nodes aren't available yet, so you have to manage that.

  2. Switch to adding DOM only as you need it, and optimise just that DOM. Keep all your data in some other structure (you could even use a Worker, or Comlink, to keep it off the main JS thread). This is generally faster and is what most apps do, but you'll have more work optimising performance in response to user actions (users might wait 8 s for your app to load initially, but once they click something they expect a response within roughly 100 ms).
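Option 1 can be sketched roughly like this (processInBatches and the batch size of 500 are my own assumptions, not code from the answer; the scheduler is passed in so the same loop works with requestAnimationFrame, requestIdleCallback, or a plain callback):

```javascript
// Hypothetical batching loop: process `total` items in chunks of
// `batchSize`, yielding to `schedule` between chunks so the browser
// can paint (or do other work) in between.
function processInBatches(total, batchSize, doBatch, schedule, done) {
  var i = 0;
  function step() {
    var end = Math.min(i + batchSize, total);
    doBatch(i, end);  // e.g. clone and append nodes i..end-1
    i = end;
    if (i < total) {
      schedule(step); // wait for the next frame / idle period
    } else if (done) {
      done();
    }
  }
  schedule(step);
}

// Browser usage, with the markup from the question:
// var elem = document.getElementsByClassName('Founder')[0];
// var parent = document.getElementsByClassName('Cloud')[0];
// processInBatches(50000, 500, function (start, end) {
//   for (var i = start; i < end; i++) {
//     var clone = elem.cloneNode(true);
//     clone.style.display = 'none';
//     parent.appendChild(clone);
//   }
// }, window.requestAnimationFrame.bind(window));
```

Swapping the scheduler for requestIdleCallback gives the background-building variant; either way the page stays responsive while the work trickles in.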
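And a rough sketch of option 2's "add DOM as you need it" idea: keep the 640,000 records as plain data and only materialise DOM for the rows the user can currently see. The visibleRange helper and the row-height numbers are illustrative assumptions; only the commented render step needs a browser:

```javascript
// Hypothetical window calculation for a virtualised list: given the
// scroll position, fixed row height, and viewport height, return the
// half-open range of rows that should exist in the DOM.
function visibleRange(scrollTop, rowHeight, viewportHeight, total) {
  var first = Math.floor(scrollTop / rowHeight);
  var count = Math.ceil(viewportHeight / rowHeight) + 1; // +1 for a partial row
  return { start: first, end: Math.min(first + count, total) };
}

// Browser render step (assumed markup from the question):
// var parent = document.getElementsByClassName('Cloud')[0];
// function render(data, range) {
//   parent.textContent = '';
//   for (var i = range.start; i < range.end; i++) {
//     var row = document.createElement('div');
//     row.textContent = data[i];
//     parent.appendChild(row);
//   }
// }
```

With fixed-height rows the DOM never holds more than a screenful of nodes, no matter how large the backing array gets, which is the property that makes this approach scale.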

answered Oct 06 '22 by Keith