In my program's source code I have the following function (a Promise concurrency-limiting function, similar to pLimit):
async function promiseMapLimit(
  array,
  poolLimit,
  iteratorFn,
) {
  const ret = [];
  const executing = [];
  for (const item of array) {
    const p = Promise.resolve().then(() => iteratorFn(item, array));
    ret.push(p);
    if (poolLimit <= array.length) {
      const e = p.then(() => executing.splice(executing.indexOf(e), 1));
      executing.push(e);
      if (executing.length >= poolLimit) {
        await Promise.race(executing);
      }
    }
  }
  return Promise.all(ret);
}
It works properly: if I pass it an array of numbers [0..99] and multiply each by 2, it gives the correct output [0..198].
const testArray = Array.from(Array(100).keys());
promiseMapLimit(testArray, 20, async (value) => value * 2).then((result) =>
  console.log(result)
);
Code sample - js playground.
But I can't understand its logic. During debugging I noticed that it adds promises in chunks of 20 and only after that goes further:
For example, this block of code:
for (const item of array) {
  const p = Promise.resolve().then(() => iteratorFn(item, array));
  ret.push(p);
will iterate over only 20 items of the array (why not all 100???)
and the same here:
if (poolLimit <= array.length) {
  const e = p.then(() => executing.splice(executing.indexOf(e), 1));
  executing.push(e);
it will add only 20 items to the executing array and only after that step inside the if (executing.length >= poolLimit) code block.
I'd be very grateful for an explanation of how this function works.
Very interesting question! I think the important part of the code here is Promise.race(...), which resolves as soon as one of the promises resolves. I have added a sleep function with a random factor (up to 6 seconds) to better visualize how this works.
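Just as a quick standalone sketch of how Promise.race behaves on its own:

// Standalone sketch: Promise.race settles with the first promise that settles.
const fast = new Promise((res) => setTimeout(res, 100, "fast"));
const slow = new Promise((res) => setTimeout(res, 500, "slow"));
Promise.race([fast, slow]).then((winner) => console.log(winner)); // logs "fast" after ~100 ms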
The expected functionality is: we always want 20 promises executing in parallel, and once one finishes, the next one in the queue starts.
Visually, that would look like this for a limit of 3 and 10 promises – in the example below you can notice that at every moment in time there are 3 active promises (except towards the end):
PromiseID | Start End |
0 [====]
1 [==]
2 [======]
3 [==========]
4 [====]
5 [================]
6 [==]
7 [====]
8 [======]
9 [========]
The full code, including the random delay, is below:
// Create the utility sleep function
const sleep = x => new Promise(res => setTimeout(res, x))

async function promiseMapLimit(array, poolLimit, iteratorFn) {
  const ret = [];
  const executing = [];
  for (const item of array) {
    const p = Promise.resolve().then(() => iteratorFn(item, array));
    ret.push(p);
    console.log(ret.length)
    if (poolLimit <= array.length) {
      const e = p.then(() => executing.splice(executing.indexOf(e), 1));
      executing.push(e);
      if (executing.length >= poolLimit) {
        console.log(`Running batch of ${executing.length} promises.`);
        await Promise.race(executing);
        // As soon as one of the promises finishes, we continue the loop.
        console.log("Resolved one promise.")
      }
    }
  }
  return Promise.all(ret);
}

const testArray = Array.from(Array(100).keys());
promiseMapLimit(testArray, 20, async (value) => {
  // Log
  console.log(`Computing iterator fn for ${value}`)
  await sleep(3000 + Math.random() * 3000);
  return value * 2
}).then((result) =>
  console.log(result)
);
will iterate over 20 items of an array (why not all 100???)
At the start, as in the graph above, it will not iterate over all 100 items, but only the first 20, and then the loop is paused by await Promise.race(...) (because executing.length >= poolLimit will be true after iterating over 20 items).
Once a promise is fulfilled, it is removed from the executing array by executing.splice(executing.indexOf(e), 1).
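As a minimal standalone sketch, here is that self-removal pattern in isolation:

// `e` can reference itself inside its own .then callback, because the
// callback only runs after `e` has already been assigned.
const executing = [];
const p = new Promise((res) => setTimeout(res, 100));
const e = p.then(() => executing.splice(executing.indexOf(e), 1));
executing.push(e);
console.log(executing.length); // 1
e.then(() => console.log(executing.length)); // 0, once p has settled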
I think things become clearer when there is a delay (await sleep(...)) so that we can simulate a real async operation (such as a database request, etc.).
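For instance, a hypothetical usage sketch (the URL list is made up, and it assumes an environment where fetch is available) where the same function limits real network requests:

// Hypothetical example: limit concurrent requests to 5.
const urls = Array.from(Array(50), (_, i) => `https://example.com/item/${i}`);
promiseMapLimit(urls, 5, (url) => fetch(url).then((res) => res.status))
  .then((statuses) => console.log(statuses));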
Please let me know if there is anything else unclear.
You have await inside an async function. This works roughly as follows: when execution reaches the await keyword, the async function is suspended until the awaited promise settles, and only then does it continue.
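A minimal sketch of that suspension:

// The code after `await` does not run until the awaited promise settles;
// in the meantime, execution returns to the caller.
async function demo() {
  console.log("before await");
  await Promise.resolve();
  console.log("after await");
}
demo();
console.log("caller continues first");
// Logged order: "before await", "caller continues first", "after await"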
keywordIn your case, it iterates 20 times, then pauses everything once you hit a limit. Then, once at least one promise inside ret
resolved it proceeds.
The next thing that happens is that once any of the promises resolves, it removes itself from the executing array. But since almost everything happens nearly instantaneously, you see it resolve all 20 promises and then fill up with another 20. If you make your iteratorFn slower with random delays, you'll see that the pool constantly fills up to 20 and then, as soon as a slot frees up, almost immediately fills it with a new promise, as long as there are elements left.
Let's replace your iteratorFn with this and call it:
let iter = async (value) => {
  // randomly delay each calculation by 1, 2 or 3 seconds
  return new Promise(resolve => setTimeout(resolve, [1000, 2000, 3000][Math.floor(Math.random() * 3)], value * 2))
}

promiseMapLimit(testArray, 20, iter).then((result) =>
  console.log(result)
);
And let's log the number of elements left inside executing each time a promise resolves:
if (poolLimit <= array.length) {
  const e = p.then(() => {
    executing.splice(executing.indexOf(e), 1);
    // logging what is left
    console.log({l: executing.length})
  });
  executing.push(e);
  if (executing.length >= poolLimit) {
    await Promise.race(executing);
  }
}
This way, in the console, you will see that the logging starts with {l: 19}, since the pool fills up and then one promise resolves. It continues like that until the very end, where the log counts down from 19 to 0 as the remaining promises finish.