This may be a really silly question. I admit I'm a bit naive about the Chrome web engine and the capabilities of the V8 JS engine.
But say I'm running a Puppeteer application which is scraping URLs from <img> tags and pushing them to an array called img2arr.
Then, I have a local file: var img1 = './image.jpg'.
Last, I have a function compare(img1, img2arr) which takes both of these as arguments and, using a library such as blink-diff or Jimp, analyzes and compares img1 against each image in img2arr. This all happens in a .forEach() or .map() loop, which works but gets slow as img2arr grows.
Say it contains 500 image URLs: is there a way to use service workers, a specific Node.js library, or anything else to spread my image looping and comparison logic across multiple threads?
For instance, 200 iterations comparing two 12 KB images take 7 seconds, but with my blazing-fast 12-core processor couldn't it take less than 1?
Sorry for my obvious naivety!
There are a few possible ways:
1. Use spawn (from child_process) to run your script separately on every core of your machine. There is a really nice article on this.
2. Use Node.js worker threads; the docs show how they work, with examples.
Regardless of which implementation you choose, the main part is to collect all the data up to this point:
Last, I have a function compare(img1, img2arr) which takes ....
Then, split your img2arr array into chunks. Any splitting method will do.
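One simple way to split the array, e.g. into one chunk per CPU core (the helper name chunkArray is ours):

```javascript
// Split arr into at most numChunks roughly equal slices.
function chunkArray(arr, numChunks) {
  const size = Math.ceil(arr.length / numChunks);
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

console.log(chunkArray([1, 2, 3, 4, 5], 2)); // [ [ 1, 2, 3 ], [ 4, 5 ] ]
```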
After splitting, pass each chunk to a separate process and wait for the first process to find a similar image.
When a process finds a similar image, you can kill all the other processes from your master process.
So, the full process would be:
1. Collect images to compare
2. Split images to compare into chunks.
3. Move the comparison logic into a separate file and run a thread/process on that file.
3a. Send each chunk to an available process.
3b. Wait for the first successful return and kill all the other processes.