Iterate through an array in blocks of 50 items at a time in node.js

I'm new to node.js and am currently trying to code array iterations. I have an array of 1,000 items - which I'd like to iterate through in blocks of 50 items at a time due to problems with server load.

I currently use a forEach loop, as seen below, which I'd like to transform into the block iteration described above:

   //result is the array of 1000 items

   result.forEach(function (item) {
     //Do some data parsing
     //And upload data to server
   });

Any help would be much appreciated!

UPDATE (in response to reply)

async function uploadData(dataArray) {
    try {
        const chunks = chunkArray(dataArray, 50);
        for (const chunk of chunks) {
            await uploadDataChunk(chunk);
        }
    } catch (error) {
        console.log(error);
        // Catch an error here
    }
}

function uploadDataChunk(chunk) {
    return Promise.all(
        chunk.map((item) => {
            return new Promise((resolve, reject) => {
                // upload code: call resolve() on success, reject(err) on failure
            });
        })
    );
}
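For illustration, here is one way the promise wrapper might be filled in, assuming a hypothetical callback-style client.upload(item, callback); the client object below is only a stand-in for a real upload API:

// Hypothetical callback-style client, for illustration only.
const client = {
  upload(item, callback) {
    setTimeout(() => callback(null, item), 100); // simulated success
  },
};

function uploadDataChunk(chunk) {
  return Promise.all(
    chunk.map((item) => {
      return new Promise((resolve, reject) => {
        client.upload(item, (err, result) => {
          if (err) return reject(err); // propagate failures to Promise.all
          resolve(result);             // mark this item as done
        });
      });
    })
  );
}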
asked by Hendies on Oct 08 '17
2 Answers

You should first split your array into chunks of 50. Then you need to make the requests one by one rather than all at once. Promises can be used for this purpose.

Consider this implementation:

function parseData() { } // returns an array of 1000 items

async function uploadData(dataArray) {
  try {
    const chunks = chunkArray(dataArray, 50);
    for(const chunk of chunks) {
      await uploadDataChunk(chunk);
    }
  } catch(error) {
    // Catch an error here
  }
}

function uploadDataChunk(chunk) {
  // return a promise of chunk uploading result
}

const dataArray = parseData();
uploadData(dataArray);

async/await uses promises under the hood, so await pauses until the current chunk has been uploaded and only then starts the next one (assuming no error occurred).
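To see this sequencing concretely, here is a self-contained sketch with a stubbed uploader; the setTimeout delay is just a stand-in for a real network call:

// Stub uploader: resolves after a short delay to simulate a network round trip.
function uploadDataChunk(chunk) {
  return new Promise((resolve) =>
    setTimeout(() => {
      console.log(`uploaded ${chunk.length} items`);
      resolve();
    }, 500)
  );
}

async function uploadData(dataArray) {
  for (let i = 0; i < dataArray.length; i += 50) {
    // the next slice is uploaded only after the previous one resolves
    await uploadDataChunk(dataArray.slice(i, i + 50));
  }
}

uploadData(Array.from({ length: 1000 }, (_, i) => i));
// logs "uploaded 50 items" twenty times, roughly 500 ms apart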

And here is my proposed implementation of the chunkArray function:

function chunkArray(array, chunkSize) {
  return Array.from(
    { length: Math.ceil(array.length / chunkSize) },
    (_, index) => array.slice(index * chunkSize, (index + 1) * chunkSize)   
  );
}
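For example, a 7-item array chunked in threes yields two full chunks plus a shorter final one:

chunkArray([1, 2, 3, 4, 5, 6, 7], 3);
// => [[1, 2, 3], [4, 5, 6], [7]]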

Note: this code uses modern JavaScript features (ES6, plus async/await from ES2017), so Babel or TypeScript may be needed for older environments.

Update

If you create multiple asynchronous database connections, use a database pooling tool.
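For instance, with the mysql2 library a pool caps the number of concurrent connections for you; the credentials and table below are placeholders:

const mysql = require('mysql2/promise');

// Placeholder credentials; connectionLimit caps concurrent connections.
const pool = mysql.createPool({
  host: 'localhost',
  user: 'app',
  database: 'mydb',
  connectionLimit: 10,
});

function uploadItem(item) {
  // Each call borrows a connection from the pool and releases it when done.
  return pool.execute('INSERT INTO items (value) VALUES (?)', [item]);
}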

Update 2

If you want to upload all items within a chunk in parallel, and start uploading the next chunk only once the previous one has finished, you can do it this way:

function uploadDataChunk(chunk) {
  return Promise.all(
    chunk.map(uploadItemToGoogleCloud) // uploadItemToGoogleCloud should return a promise
  );
}
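Together with the uploadDataChunk above, a stubbed uploadItemToGoogleCloud (hypothetical; replace the body with your real client call) shows the parallel behaviour:

// Hypothetical stub; replace the body with a real upload call.
function uploadItemToGoogleCloud(item) {
  return new Promise((resolve) => {
    setTimeout(() => resolve(item), 100); // simulated upload latency
  });
}

// Items within a chunk upload in parallel; Promise.all resolves once all
// items are done, or rejects as soon as any single upload fails.
uploadDataChunk([1, 2, 3]).then((results) => console.log(results)); // [1, 2, 3]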
answered by Yuriy Yakym

You may chunk your array into the required chunk size as follows:

function chunkArray(a, s){ // a: array to chunk, s: size of chunks
  return Array.from({length: Math.ceil(a.length / s)})
              .map((_, i) => a.slice(i * s, i * s + s)); // slice keeps the last chunk short instead of padding it with undefined
}

var arr = Array(53).fill().map((_, i) => i); // test array of 53 items
console.log(chunkArray(arr, 5));             // 11 chunks of 5 items, the last with 3
answered by Redu