Background Setup
I have a web application that creates images from a set of other images. The way I've chosen to do that is by reading in a set of images and placing them on an HTML canvas. Then I export each canvas as a JPEG to a third-party API by calling toDataURL and converting the result into a Blob. The issue I am facing is that I have many of these canvases all exporting data as JPEGs, and it is consuming a lot of resources. The application slows and becomes unresponsive as each canvas calls toDataURL.
Question
I've found that calling a canvas's toDataURL() or toBlob() can be very slow, especially for large canvas sizes. I'd like to take advantage of the multi-threaded nature of web workers.

First, I tried passing in the canvas object itself, but an error was thrown. It turns out objects are a problem: they either get converted into strings or fail when they can't be cloned. Either way, I found that passing the context's image data does work. The data is passed in the form of raw RGBA values as a Uint8ClampedArray, obtained from the canvas context's getImageData() method.
Main.js
var canvas = document.createElement('canvas');
var context = canvas.getContext('2d');
var worker = new Worker('myWorker.js');
worker.postMessage({
  image: context.getImageData(0, 0, canvas.width, canvas.height)
});
myWorker.js
this.onmessage = function(e) {
  // GOAL: turn e.data.image into an image blob or data URL and return it,
  // e.g. this.postMessage(new Blob([e.data.image.data], {type: 'image/jpeg'}));
}
I think it comes down to knowing how to convert the Uint8ClampedArray, which holds the raw RGBA information, into JPEG/PNG data.
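For reference, the array returned by getImageData holds four bytes per pixel (RGBA), row by row. A small sketch of indexing into it, using a made-up two-pixel "image" for illustration:

```javascript
// ImageData.data is a Uint8ClampedArray of RGBA bytes in row-major order:
// index = (y * width + x) * 4  →  [R, G, B, A]
function pixelAt(imageData, x, y) {
  var i = (y * imageData.width + x) * 4;
  var d = imageData.data;
  return { r: d[i], g: d[i + 1], b: d[i + 2], a: d[i + 3] };
}

// A tiny 2x1 "image": an opaque red pixel, then a half-transparent blue one.
var fake = {
  width: 2,
  height: 1,
  data: new Uint8ClampedArray([255, 0, 0, 255, 0, 0, 255, 128])
};

// pixelAt(fake, 1, 0) → { r: 0, g: 0, b: 255, a: 128 }
```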
The reason I think this might be useful is that getImageData just copies an existing data structure from the canvas context, and is therefore not as costly as toDataURL. I captured the CPU profile while calling something similar to the code block below:
var image = context.getImageData(0, 0, canvas.width, canvas.height)
var dataUrl = canvas.toDataURL('image/jpeg');
and got:

[CPU profile screenshot]
So, given that, I'd like to offload the brunt of the process to a web worker. I don't even mind if it takes longer inside the web worker, as long as it's happening off the main thread.
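As an aside (this is my assumption, not part of the original setup): the copy that postMessage makes of the pixel array can itself be avoided by passing the underlying ArrayBuffer in the transfer list, which moves ownership to the worker and detaches the buffer on the sending side. The same transfer semantics can be demonstrated outside a browser with structuredClone (available in modern browsers and Node 17+):

```javascript
// In a browser you would transfer the pixel buffer to the worker like this:
//   var imageData = context.getImageData(0, 0, canvas.width, canvas.height);
//   worker.postMessage(
//     { pixels: imageData.data.buffer, width: canvas.width, height: canvas.height },
//     [imageData.data.buffer]  // transfer list: moved, not copied
//   );

// Transfer semantics demonstrated with structuredClone:
var buf = new Uint8ClampedArray([255, 0, 0, 255]).buffer;
var moved = structuredClone(buf, { transfer: [buf] });

// `moved` now holds the 4 bytes; `buf` is detached (byteLength === 0).
```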
A couple of extra thoughts about it:

- I tried using encoderOptions, the second argument to toDataURL, as a way to speed up the process, but I haven't seen much of a change.

---- UPDATE ----
I thought I would share my solution as an npm library: https://www.npmjs.com/package/jpeg-web-worker. It explains how to harness the provided web-worker to do the heavy lifting for you.
---------------------
I got a solution that works for me, speeding up the application and the responsiveness of the page while still generating the new images.
Here is the app code:
App
var canvas = $('#myCanvas')[0];
var context = canvas.getContext('2d');
var imageData = context.getImageData(0, 0, canvas.width, canvas.height);
var worker = new Worker('myWorker.js');
worker.postMessage({
  image: imageData
});
worker.onmessage = function(e) {
  var blob = new Blob([e.data.data], {type: 'image/jpeg'});
  // use blob
};
And here is the worker code:
Worker
this.onmessage = function(e) {
  var jpgInfo = encode(e.data.image, 50); // 50 = JPEG quality
  this.postMessage(jpgInfo);
};
function encode() { ... } // ported from jpeg-js
Obviously, the bulk of this answer comes from the encode function. This function was adapted from the npm module jpeg-js, specifically the file encoder.js. I ported the encode function by copying the entire encoder.js file into my myWorker.js. It's not tiny, but it's also very self-contained, which made it easy. The only issue left was modifying the code so that it worked outside of the Node.js environment it was built for.

This turned out to be relatively easy:

- Remove the dependency on Buffer. This was a two-step process. First, remove the atob definition at the top (as it's not needed). Second, return a new Uint8Array at the end of the this.encode function. The current version actually has this commented out right above the Buffer reference; just use that one and remove everything underneath.

I don't have exact timing measurements, but it went from ~10 seconds of lag time as the images were generated to under a second of lag time. I'm using "lag time" here to mean sluggish performance while using the page.
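Once the worker posts the encoded bytes back, the main thread wraps them in a Blob, as in the app code above. A minimal sketch of that last step, using stand-in bytes instead of real worker output (Blob is also a global in Node 18+, while URL.createObjectURL is browser-only and shown as a comment):

```javascript
// e.data.data would be the Uint8Array of JPEG bytes returned by the worker;
// here we use stand-in bytes (a JPEG's SOI and EOI markers) for illustration.
var jpegBytes = new Uint8Array([0xFF, 0xD8, 0xFF, 0xD9]);
var blob = new Blob([jpegBytes], { type: 'image/jpeg' });

// blob.size === 4, blob.type === 'image/jpeg'
// In the browser you could then do:
//   var url = URL.createObjectURL(blob);
//   imgElement.src = url; // or append the blob to FormData for upload
```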