I am trying to read multiple JSON files simultaneously, combine their data into a single array, and then do some processing on that array in a Node.js server.
I would like to read the files and run the processing tasks simultaneously using web workers.
I have read a few interesting tutorials and articles on the subject, but none of them clearly explains how to run simultaneous tasks with web workers.
They cover running a single task off the main thread, but I need to run multiple tasks at once.
I also know that creating many workers is not recommended, according to the Node.js documentation.
Maybe I misunderstand how web workers function, or how they should be implemented in order to perform multiple tasks.
I also tried the great threads.js library (https://threads.js.org/), but its documentation is still unclear to me about running multiple tasks.
Can anyone please explain the best-practice way to implement this kind of work, along with the pros and cons?
I would prefer a vanilla JS solution over a library, so the explanation can also serve as a reference for other readers.
If possible, an explanation of how to use the threads.js library would also be helpful for future reference.
Thank you very much.
A web worker is a JavaScript program running on a different thread, in parallel with the main thread. The browser gives each tab one main thread, and that main thread can spawn as many web workers as the user's system resources allow.
Our Node.js applications are only sort of single-threaded in reality: we can run things in parallel (I/O, for example) without creating or synchronizing threads ourselves.
Web workers let you write genuinely multi-threaded JavaScript, meaning different parts of your code can run at the same time. Without web workers, all of your code runs on the UI thread; even things that seem multi-threaded, like Ajax callbacks, setTimeout and setInterval, are actually handled on that single thread.
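In the browser, the basic pattern looks like this (a minimal sketch; worker.js and the message shapes are made up for illustration):
// main.js (browser)
const worker = new Worker('worker.js');
worker.onmessage = (event) => {
  console.log('result from worker:', event.data);
};
worker.postMessage({ numbers: [1, 2, 3, 4] });

// worker.js (browser)
self.onmessage = (event) => {
  // Heavy work happens here, off the UI thread.
  const sum = event.data.numbers.reduce((total, n) => total + n, 0);
  self.postMessage(sum);
};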
As I'm sure you have read, JavaScript in Node.js runs on a single main thread, so running your transactions in parallel on that thread is not going to happen; worker threads do run in parallel, but they are aimed at CPU-heavy work rather than at making I/O such as file reads faster.
A worker thread is better suited to longer, more processing-intensive functions that you want to hand off without blocking the main event loop. Think of uploading and processing an image: we don't want to hold up the entire event loop while the image is processed, so we pass that work to a worker thread, and it tells the event loop when it's done and returns the response.
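In Node.js, the built-in worker_threads module provides exactly this hand-it-off-and-get-notified pattern. Here is a minimal single-file sketch (the counting loop is just a stand-in for real CPU-heavy processing):
// offload.js
const { Worker } = require('worker_threads');

// The worker's code is inlined as a string (eval: true) so the sketch fits in one file.
const workerCode = `
  const { parentPort, workerData } = require('worker_threads');
  // Simulate CPU-heavy processing on the data we were given.
  let total = 0;
  for (let i = 0; i < workerData.iterations; i++) total += i;
  parentPort.postMessage(total);
`;

const worker = new Worker(workerCode, {
  eval: true,
  workerData: { iterations: 1e8 },
});

worker.on('message', (result) => console.log('worker finished with:', result));
worker.on('error', (err) => console.error(err));

console.log('main thread stays responsive while the worker runs');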
I think what you may be looking for is just a promise. Say you have an array of JSON file names like ["file1.json", "file2.json"]; you would loop over it, read each file's contents, parse the JSON, and insert or concatenate it onto the main array variable.
Once the promise resolves, you would use
.then(() => { /* do your processing of the full array */ })
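Concretely, that could look something like this sketch (the file names and the final processing step are placeholders, and it assumes each file contains a JSON array):
// readJsonFiles.js
const fs = require('fs/promises');

const files = ['file1.json', 'file2.json'];

// Read and parse all files concurrently; the event loop stays free while libuv handles the I/O.
Promise.all(files.map(async (file) => JSON.parse(await fs.readFile(file, 'utf8'))))
  .then((parsedFiles) => {
    // Each file is assumed to hold an array, so flatten them into one combined array.
    const combined = parsedFiles.flat();
    // Do your processing of the full array here.
    console.log('combined length:', combined.length);
  })
  .catch((err) => console.error('failed to read files:', err));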
Here's an example with a library (node-worker-threads-pool).
Thread/worker management is a complex endeavor, and I would not recommend trying to build a fully generic solution yourself; even the library I'm suggesting here may not be the right fit for your case.
// sample.js
const { StaticPool } = require('node-worker-threads-pool');

const start = async function () {
  const staticPool = new StaticPool({
    size: 4,
    task: async function (n) {
      const sleep = async function (ms) {
        return new Promise(resolve => setTimeout(resolve, ms));
      };
      console.log(`thread ${n} started`);
      await sleep(1000 * n);
      return n + 1;
    }
  });

  // start 4 workers, each will run asynchronously and take a longer time to finish
  for (let index = 0; index < 4; index++) {
    staticPool.exec(index)
      .then((result) => {
        console.log(`result from thread pool for thread ${index}: ${result}`);
      })
      .catch((err) => console.error(`Error: ${err}`));
  }
};

start();
I ran this from the terminal with node sample.js.
As discussed in the other answer, it may not be useful (in terms of performance) to do this, but this example shows how it can be done.
The library also has examples where you give the tasks specific work.
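For instance, staying with the same StaticPool API used above, the exec parameter can carry the actual data each worker should handle. This is just a sketch of that idea, not the library's own example; the chunked input is made up for illustration:
// poolWork.js
const { StaticPool } = require('node-worker-threads-pool');

// Each task receives one chunk of records and does CPU-bound work on it.
const pool = new StaticPool({
  size: 4,
  task: function (chunk) {
    return chunk.reduce((sum, record) => sum + record.value, 0);
  }
});

const chunks = [
  [{ value: 1 }, { value: 2 }],
  [{ value: 3 }, { value: 4 }],
];

// Hand each chunk to the pool and combine the partial results.
Promise.all(chunks.map((chunk) => pool.exec(chunk)))
  .then((partialSums) => {
    console.log('total:', partialSums.reduce((a, b) => a + b, 0));
  })
  .catch((err) => console.error(err));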