I am doing "heavy" canvas operations in a jQuery each loop, causing slower devices (IE and the iPad) to sometimes become totally unresponsive.
So I was thinking I could use underscore's _.defer() to queue the functions in my each loop, like:
var handleAsset = _.defer(function(){
    // weightlifting goes here (partly async)
});
$.each(assets, handleAsset);
Yet this throws a weird error (the stack trace points to the $.each):
Uncaught TypeError: Object 20877 has no method 'call'
Is this approach flawed? Is this due to async operations going on inside the handler function? Is there another / a better way to achieve this?
It is flawed. You should try to decouple / break up code at the lowest point possible. I think it's unlikely that just decoupling each iteration of a loop will be enough in the long run.
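As an aside on the error itself: `_.defer()` schedules the function and returns a timer id, so `handleAsset` in the question ends up being a number rather than a function, and `$.each()` fails when it tries to `.call()` it. A minimal plain-JS reproduction of that type mismatch, using `setTimeout` as a stand-in for `_.defer` (which is essentially what it wraps):

```javascript
// defer() stands in for _.defer(): it schedules fn and returns a timer
// handle (a number in browsers, a Timeout object in Node), not a function.
function defer(fn) {
    return setTimeout(fn, 0);
}

var handleAsset = defer(function () {
    // heavy lifting would go here
});

// handleAsset is NOT callable, which is what the
// "Object 20877 has no method 'call'" error is complaining about:
console.log(typeof handleAsset === 'function'); // false

clearTimeout(handleAsset); // tidy up the pending timer
```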
However, what you really need to do is set up an asynchronous run-away timer which gives the implementation enough room to update the UI queue (or UI thread). This is typically done with methods like setTimeout() (client), process.nextTick (Node.js) or setImmediate (coming soon).
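The property all of these schedulers share is that the callback never runs on the current call stack; the stack unwinds first, and that gap is exactly where the browser gets to service its UI queue. A tiny illustration (plain JS, behaves the same in Node):

```javascript
var ran = false;

setTimeout(function () {
    ran = true; // runs only after the current call stack has unwound
}, 0);

// Any synchronous work here still sees ran === false -- the callback
// has not had a chance to run yet:
console.log(ran); // false
```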
For instance, let's say we have an array and we want to process each entry:
var data = new Array(10000).join('data-').split('-'); // create 10,000 entries

function process( elem ) {
    // assume a heavy operation per element
    elem.charAt(1) + elem.charAt(2);
}

for( var i = 0, len = data.length; i < len; i++ ) {
    process( data[i] );
}
Now this code is a classic loop, iterating over the array and processing its data. It will consume 100% CPU time and therefore block the browser's UI queue for as long as it takes to process all entries (which basically means the browser UI will freeze and become unresponsive).
To avoid that, we could create a construct like this:
var data = new Array(10000).join('data-').split('-'); // create 10,000 entries

function runAsync( data ) {
    var start = Date.now();

    do {
        process( data.shift() );
    } while( data.length && Date.now() - start < 100 ); // stay within a 100 ms budget

    if( data.length ) {
        setTimeout( runAsync.bind( null, data ), 100 );
    }
}

runAsync( data.concat() ); // .concat() copies the array, so the original stays intact
What happens here? We are basically:

- processing the data in chunks
- processing each chunk for no longer than 100 ms
- re-invoking the processing asynchronously (via setTimeout) to give the UI a chance to update

Any delay above 100 ms is typically recognized by the human eye as "lag"; anything below that feels fluid (at least our eyes will tell us so). That makes 100 ms a good limit for maximum processing time, and I'd even suggest going down to 50 ms.
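The time-slicing idea generalizes nicely if the budget is a parameter. A sketch (processChunk is a hypothetical helper, not part of the code above) that processes one slice and returns whatever is left for the next timer tick:

```javascript
// Process entries from data until the time budget is spent, then return
// the remainder so the caller can schedule the next slice via setTimeout.
function processChunk(data, budgetMs, process) {
    var start = Date.now();
    do {
        process(data.shift());
    } while (data.length && Date.now() - start < budgetMs);
    return data; // whatever is left over for the next slice
}

// With a cheap process function the whole array fits into one 50 ms slice:
var leftover = processChunk(['a', 'b', 'c'], 50, function (elem) {
    elem.charAt(0); // stand-in for real work
});
console.log(leftover.length); // 0
```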
The caveat here is that the overall processing time increases, but I think it's a better deal to have longer processing and stay responsive, rather than faster processing and a very bad user experience.
Quick Demo:
So you want to limit the number of concurrent asynchronous operations? The flaw in your implementation is that you will be deferring each action until the previous one has completed.
One option is to use a sequence helper, you could then break this queue up into more manageable chunks for processing.
https://github.com/michiel/asynchelper-js/blob/master/lib/sequencer.js
var actions = [];

$.each(assets, function(key, value) {
    actions.push(function(callback) {
        $.ajax({
            url: 'process.php?id=' + value,
            success: function(msg) {
                callback();
            }
        });
    });
});
var sequencer = new Sequencer(actions);
sequencer.start();
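For readers who don't want to pull in the linked file, the core of such a sequencer is small. A hypothetical sketch (the real sequencer.js may differ) that runs actions strictly one after another, each receiving a callback that starts the next:

```javascript
function Sequencer(actions) {
    this.actions = actions.slice(); // copy, so the caller's array is untouched
}

Sequencer.prototype.start = function () {
    var self = this;
    (function next() {
        var action = self.actions.shift();
        if (action) {
            action(next); // the action invokes next() when it is done
        }
    })();
};

// Usage with synchronous actions, to show the ordering:
var order = [];
new Sequencer([
    function (done) { order.push('first'); done(); },
    function (done) { order.push('second'); done(); }
]).start();
console.log(order.join(',')); // "first,second"
```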
If you split your actions array into two arrays and run them side by side, you would only ever have two processes running at a time until both queues have completed.
e.g.
var arr1 = actions.splice(0, 100);
var arr2 = actions.splice(0, 100); // splice() mutates, so this takes the next 100
var sequencer1 = new Sequencer(arr1);
sequencer1.start();
var sequencer2 = new Sequencer(arr2);
sequencer2.start();
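Splitting by hand gets tedious beyond two queues. A small hypothetical helper that cuts the actions array into n roughly equal queues, each of which could then be handed to its own Sequencer:

```javascript
// Split actions into n queues of roughly equal size (the last one may be
// shorter). slice() is used instead of splice() so actions is not mutated.
function splitQueues(actions, n) {
    var size = Math.ceil(actions.length / n);
    var queues = [];
    for (var i = 0; i < actions.length; i += size) {
        queues.push(actions.slice(i, i + size));
    }
    return queues;
}

var queues = splitQueues([1, 2, 3, 4, 5], 2);
console.log(queues.length);    // 2
console.log(queues[0].length); // 3
console.log(queues[1].length); // 2
```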