I have an array of arrays of promises, and each inner array could hold 4,000, 2,000, or 500 promises.
In total there are around 60,000 promises, and I may test it with other values as well.
Now I need to execute Promise.all(BigArray[0]) first.
Once the first inner array is done, I need to execute the next Promise.all(BigArray[1]), and so on.
If I try to execute Promise.all(BigArray) directly, it throws:
fatal error call_and_retry_2 allocation failed - process out of memory
I need to execute each batch of promises in series, not in parallel, which I think is what Node is doing. I'd prefer not to use a new library, but I'm willing to consider any answer!
Edit:
Here is an example piece of code:
function getInfoForEveryInnerArgument(InnerArray) {
    // _.map is lodash; this starts a getDBInfo call for every argument in the batch
    const CPTPromises = _.map(InnerArray, (argument) => getDBInfo(argument));
    return Promise.all(CPTPromises)
        .then((results) => doSomethingWithResults(results));
}
function mainFunction() {
    const BigArray = [[argument1, argument2, argument3, argument4], [argument5, argument6, argument7, argument8] /* ... */];
    // the sum of all arguments is over 60k...
    const promiseArrayCombination = _.map(BigArray, (InnerArray) => getInfoForEveryInnerArgument(InnerArray));
    return Promise.all(promiseArrayCombination).then((fullResults) => {
        console.log(fullResults);
        return fullResults;
    });
}
Promise.all will not work here; you can use Array.prototype.reduce to process the BigArray elements one by one, chaining each batch onto the previous one:
BigArray.reduce((promiseChain, currentArray) => {
    return promiseChain.then(chainResults =>
        Promise.all(currentArray).then(currentResult =>
            [...chainResults, currentResult]
        )
    );
}, Promise.resolve([])).then(arrayOfArraysOfResults => {
    // Do something with all results
});
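One caveat: a promise starts executing as soon as it is created, so if BigArray already holds constructed promises, the reduce above only staggers the waiting, not the underlying work, and all 60k promises still exist at once. To actually run the batches in series (and keep memory bounded), keep BigArray as arrays of arguments and create the promises inside the chain. Here is a minimal sketch, assuming getDBInfo returns a promise as in the question; processBatch is an illustrative helper, not part of the original code:

// Promises for a batch are only created here, inside the chain,
// once the previous batch has settled
function processBatch(argumentBatch) {
    return Promise.all(argumentBatch.map(argument => getDBInfo(argument)));
}

BigArray.reduce((promiseChain, argumentBatch) => {
    return promiseChain.then(chainResults =>
        processBatch(argumentBatch).then(batchResult =>
            [...chainResults, batchResult]
        )
    );
}, Promise.resolve([])).then(arrayOfArraysOfResults => {
    // Same result shape as above, but only one batch in flight at a time
});

With this shape, only one batch of getDBInfo calls exists at any moment, which is what avoids the out-of-memory error from the question.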