I want to know: if I send multiple fetch requests, more than the browser's maximum number of parallel requests, can I fire them all at once and let the browser handle them automatically? Or should I first split them into sufficiently small batches and chain those?
Promise.all(allPromises)
or any other solutions like:
function fetchAll(urls) {
  const requestPromises = urls.map(url => {
    return fetch(url).then(response => response.json());
  });
  return requestPromises.reduce((chain, requestPromise) => {
    return chain.then(() => requestPromise);
  }, Promise.resolve());
}
or
getBunch([promises1]).then(() => getBunch([promises2])) ...
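For reference, the chaining approach above does not actually limit concurrency: `fetch()` starts its request the moment it is called inside `map()`, so chaining the resulting promises only changes when you read the results. A batching version would have to delay calling `fetch()` itself. A sketch (the names `fetchInBatches` and `batchSize` are illustrative, not from the question):

```javascript
// Sketch: split the URLs into batches and fetch each batch sequentially,
// so at most batchSize requests are in flight at any one time.
async function fetchInBatches(urls, batchSize) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    // fetch() is only called here, when this batch's turn comes.
    const batchResults = await Promise.all(
      batch.map(url => fetch(url).then(response => response.json()))
    );
    results.push(...batchResults);
  }
  return results;
}
```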
If you don't want to continue until all of the fetches return, it's worth just using Promise.all() and letting the browser handle queuing the requests. One thing to note is that Promise.all() "fails fast": unless you're handling errors individually, one rejected promise will reject the whole Promise.all().
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all#Promise.all_fail-fast_behaviour
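To illustrate the fail-fast behaviour without hitting the network, here is a small sketch using plain promises; catching per promise keeps the successful results:

```javascript
// Three tasks, one of which rejects.
const tasks = [
  Promise.resolve(1),
  Promise.reject(new Error("boom")),
  Promise.resolve(3),
];

// Fail-fast: a single rejection rejects the whole Promise.all().
Promise.all(tasks).catch(err => console.log("all rejected:", err.message));

// Handling errors individually keeps the other results.
Promise.all(tasks.map(p => p.catch(() => null)))
  .then(results => console.log(results)); // logs [ 1, null, 3 ]
```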