I'm trying to figure out a clean way to use web workers to perform API calls. So far I have this on the client side:
worker-client.js
export const workerFetch = (method = "get", url = "/", data = {}) =>
  new Promise((resolve, reject) => {
    // Tag the request so the response can be matched back to this promise
    const listener = `worker_fetch_${url}_${new Date()}`;
    window.lootWorker.postMessage({
      cmd: "worker_fetch",
      payload: { method, url, data, listener }
    });
    // Messages posted by a dedicated worker arrive on the Worker instance, not on window
    window.lootWorker.addEventListener(
      "message",
      ({ data }) => {
        if (data.cmd === listener) {
          if (data.success) resolve(data.data);
          else reject(data.data);
        }
      },
      false
    );
  });
So this function can be called from the client with some parameters. Inside the web worker itself:
worker-thread.js
const workerFetch = ({ method = "get", url = "", data = {}, listener = "" }) => {
  // fetch takes the URL first and an options object second
  fetch(url, {
    method,
    headers: { "Content-Type": "application/json" },
    body: method.toLowerCase() === "get" ? undefined : JSON.stringify(data)
  })
    .then(res => res.json())
    .then(data => {
      // Echo the listener tag back so the client can match the response to its request
      self.postMessage({ cmd: listener, success: true, data });
    })
    .catch(error => {
      self.postMessage({ cmd: listener, success: false, data: error });
    });
};
self.addEventListener(
  "message",
  ({ data }) => {
    switch (data.cmd) {
      case "worker_fetch":
        workerFetch(data.payload);
        break;
      default:
        break;
    }
  },
  false
);
In theory this works, but I am concerned about bubbling up those event listeners on the client side, hence this question: is there a common way of achieving this functionality that focuses on performance? After all, the whole point is to offload work from the main thread.
I'd have a few reservations about this pattern:
1. Remember that only the processing of the response happens on another thread, so you'd need to be sure that processing your payload is actually a bottleneck before offloading all fetch requests.
2. If you are concerned about bubbling, you should set useCapture to true in addEventListener and use stopPropagation to prevent bubbling to children.
3. You are assuming that all fetch calls return in sequence. Now that you're offloading to a worker, this can no longer be guaranteed, so you'd need to tag each request with some sort of identifier (see the sketch below).
How you implement #3 is up for debate. If you are focusing on performance, you might want to research performant UUID generators.
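For what it's worth, here is a rough sketch of how #2 and #3 could be handled together on the client: keep a single permanent message listener on the worker, tag every request with a cheap incrementing id, and resolve the matching promise from a map of pending requests. The names apiWorker, pending and nextId are just placeholders for illustration, and it assumes the same lootWorker / worker_fetch setup used above.
// worker-client.js (sketch): one permanent listener for the whole app instead of one per request
const apiWorker = window.lootWorker;   // assumes the worker has already been created elsewhere
const pending = new Map();             // request id -> { resolve, reject }
let nextId = 0;                        // cheap incrementing id; swap in crypto.randomUUID() if you need global uniqueness

// Single listener: look up the pending promise by id and settle it
apiWorker.addEventListener("message", ({ data }) => {
  const request = pending.get(data.id);
  if (!request) return;                // not a worker_fetch reply
  pending.delete(data.id);             // nothing accumulates between calls
  if (data.success) request.resolve(data.data);
  else request.reject(data.data);
});

export const workerFetch = (method = "get", url = "/", data = {}) =>
  new Promise((resolve, reject) => {
    const id = ++nextId;               // tag the request so out-of-order replies still match up
    pending.set(id, { resolve, reject });
    apiWorker.postMessage({ cmd: "worker_fetch", payload: { method, url, data, id } });
  });
The only change this would need on the worker side is to echo the id from the payload back in its postMessage call instead of the listener string.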
And finally, there appears to be an npm package that achieves this functionality: fetch-worker