I'm building an emoji picker, and the most demanding task is creating ~1500 DOM elements, one for each emoji, which blocks the main thread and makes the page unresponsive for about 500-700ms.
I've debugged this and it seems like the creation of DOM elements is what blocks the rest of JS execution:
function load_emojis(arr){
    var $emojis = [];
    // arr.length = 1500
    _.each(arr, function(){
        $emojis.push($('<span/>').append($('<img/>')));
    });
    $('.emojis').html($emojis);
}
Is there a way to execute this whole thing asynchronously / in another thread so it doesn't block the JS that follows it?
I've tried putting it inside setTimeout, but that still seems to execute in the same thread and thus still blocks JS execution.
JavaScript isn't threaded; like most UI frameworks, it does everything in a single thread with an event loop. Async operations may do work on background threads (invisible to the JS programmer, who never explicitly manages or even sees them), but their results are always delivered back to the single foreground thread for processing.
Element rendering isn't actually done until your JS code finishes and control returns to the event loop; if you're rendering too many things, the delay happens when the browser needs to draw them, and there is not much you can do to help that. The most you can realistically do is reduce the parsing overhead by explicitly creating elements rather than handing the browser a wad of text to parse, but even then your options are limited.
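For instance, a minimal sketch of that difference (the class name here is just an illustration): instead of handing jQuery an HTML string to parse for every emoji, the same nodes can be constructed directly:

// Parsed from text: the HTML parser runs once per emoji.
var $parsed = $('<span class="emoji"><img></span>');

// Created explicitly: no parsing, just direct object construction.
var span = document.createElement('span');
span.className = 'emoji';
span.appendChild(document.createElement('img'));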
Things that might help, depending on browser, phase of moon, etc., include:
1. Creating a single template node and using cloneNode(true) to copy it, filling in the small differences afterwards; this avoids the work of parsing X node trees when it's really the same node tree repeated many times with a few attribute tweaks (see the sketch after this list).
2. Using setTimeout/setInterval to insert the elements in small batches, with enough of a window between them that any given insertion doesn't take very long and the UI stays responsive.
3. A big one for "array of many images" is to stop using 1500+ images, and instead use a single monolithic image (a sprite sheet). You then either:
   a. use the monolithic image at a given fixed offset and size repeatedly via CSS (the image is decompressed and rendered once, and views of it are mapped at different offsets repeatedly), or
   b. use the map/area tags to insert the image only once, but make clicks behave differently on each part of the image (this reduces the DOM layout work to a single image; all the other DOM elements must still exist in the tree, but don't need to be rendered).
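To make the first suggestion concrete, here is a minimal sketch of the template/cloneNode(true) approach; it assumes each emoji differs only by its image URL, and the urls array and container selector are placeholders:

function load_emojis(urls) {
    // One template node, built once; no markup parsing afterwards.
    var template = document.createElement('span');
    template.appendChild(document.createElement('img'));

    var fragment = document.createDocumentFragment();
    urls.forEach(function (url) {
        // cloneNode(true) copies the whole subtree without re-parsing anything.
        var node = template.cloneNode(true);
        node.firstChild.src = url; // assumed to be the only per-emoji difference
        fragment.appendChild(node);
    });

    // Touch the live DOM once, with the whole batch.
    document.querySelector('.emojis').appendChild(fragment);
}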
Here is a function that will divide the work into chunks:
function load_emojis(arr){
    var chunk = 20;
    $('.emojis').html(''); // clear before start, just to be sure
    (function loop(i) {
        if (i >= arr.length) return; // all done
        var $emojis = [];
        $.each(arr.slice(i, i + chunk), function(){
            $emojis.push($('<span/>').append($('<img/>')));
        });
        $('.emojis').append($emojis);
        setTimeout(loop.bind(null, i + chunk));
    })(0);
}
This schedules one setTimeout for every chunk of 20 items of your array.
Obviously the total time to complete will be longer, but other JS and user events can happen during the many little pauses.
I left out the second argument of setTimeout since the default (0) is enough to yield to other tasks in the event queue.
Also, I found your use of html() a bit odd: the documentation says the argument should be a string or a function, but you gave it an array of jQuery elements. For that, append() is the function to use.
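To illustrate the difference (a hedged example, not taken from your code): html() expects a markup string (or a function returning one), while append() also accepts DOM nodes, jQuery objects, and arrays of them:

var $emojis = [$('<span/>'), $('<span/>')];

$('.emojis').html('<span></span>'); // html(): a markup string
$('.emojis').append($emojis);       // append(): an array of jQuery objects works here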
Play with the chunk size to find the ideal value; it will probably be bigger than just 20.