Finally, async/await will soon be supported in all major browsers except IE. So now we can start writing more readable code with async/await, but there is a catch. A lot of people use async/await like this:
const userResponse = await fetchUserAsync();
const postsResponse = await fetchPostsAsync();
While this code is readable, it has a problem: it runs the functions in series. It won't start fetching the posts until fetching the user has finished. The solution is simple: we need to fetch the resources in parallel.
So what I want to do is (in pseudo language):
fn task() {
  result-1 = doAsync();
  result-2 = doAsync();
  result-n = doLongAsync();

  // handle results together
  combinedResult = handleResults(result-1, result-2);
  lastResult = handleLastResult(result-n);
}
You can use Promise.all for the parallel function calls, but that answer does not behave the way you might want when an error occurs. Caveat: it doesn't matter whether the await calls are on the same line or on different lines, as long as the first await happens after all of the asynchronous calls have been started.
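To illustrate the caveat, here is a minimal sketch (the `delay` helper and the `'user'`/`'posts'` values are made up, standing in for real fetches). Both operations are started before the first await, so they run concurrently even though the awaits sit on separate lines:

```javascript
// Hypothetical helper: resolves with `value` after `ms` milliseconds.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

async function concurrent() {
  // Both operations start here, before any await...
  const userPromise = delay(100, 'user');
  const postsPromise = delay(100, 'posts');
  // ...so the total time is ~100ms, not ~200ms,
  // even though the awaits are on different lines.
  const user = await userPromise;
  const posts = await postsPromise;
  return [user, posts];
}

concurrent().then(result => console.log(result)); // → [ 'user', 'posts' ]
```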
Asynchronous operations in parallel: the async library's async.parallel() method is used to run multiple asynchronous operations in parallel. The first argument to async.parallel() is a collection of the asynchronous functions to run (an array, object, or other iterable).
Here's a quick example of what running code in parallel in JavaScript looks like. This will execute promise1, promise2, and promise3 in parallel, then combine the response from each promise into an array.
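A minimal sketch of that pattern (the three promises are stand-ins for real asynchronous work such as fetches):

```javascript
// Three stand-in promises; in real code these would be fetches etc.
const promise1 = Promise.resolve(1);
const promise2 = Promise.resolve(2);
const promise3 = new Promise(resolve => setTimeout(() => resolve(3), 50));

// All three are already running; Promise.all just waits for them
// and resolves with their results in input order.
Promise.all([promise1, promise2, promise3]).then(results => {
  console.log(results); // → [ 1, 2, 3 ]
});
```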
For more information, I have an async/await intro on my blog. Additionally, note that when a method containing multiple awaits is called, it is the caller's responsibility to await it if it needs every statement of that method to have finished.
You can write something like this:
const responses = await Promise.all([
  fetchUserAsync(),
  fetchPostsAsync(),
]);
const userResponse = responses[0];
const postsResponse = responses[1];
This is easy, right? But there is a catch. Promise.all has fail-fast behaviour, which means it will reject as soon as one of the promises rejects. You probably want a more robust solution where we are in charge of handling the rejection of any of the fetches. Luckily this can be achieved with plain async/await, without the need for Promise.all. A working example:
console.clear();

function wait(ms, data) {
  return new Promise(resolve => setTimeout(resolve.bind(this, data), ms));
}

/**
 * This will run in series, because
 * we call a function and immediately wait for its result,
 * so this will finish in 1s.
 */
async function series() {
  return {
    result1: await wait(500, 'seriesTask1'),
    result2: await wait(500, 'seriesTask2'),
  };
}

/**
 * While here we call the functions first,
 * then wait for the results later, so
 * this will finish in 500ms.
 */
async function parallel() {
  const task1 = wait(500, 'parallelTask1');
  const task2 = wait(500, 'parallelTask2');
  return {
    result1: await task1,
    result2: await task2,
  };
}

async function taskRunner(fn, label) {
  const startTime = performance.now();
  console.log(`Task ${label} starting...`);
  let result = await fn();
  console.log(`Task ${label} finished in ${
    Number.parseInt(performance.now() - startTime)
  } milliseconds with,`, result);
}

void taskRunner(series, 'series');
void taskRunner(parallel, 'parallel');

/*
 * The result will be:
 * Task series starting...
 * Task parallel starting...
 * Task parallel finished in 500 milliseconds with, { "result1": "parallelTask1", "result2": "parallelTask2" }
 * Task series finished in 1001 milliseconds with, { "result1": "seriesTask1", "result2": "seriesTask2" }
 */
Note: You will need a browser which has async/await enabled to run this snippet (or Node.js v7.6 and above).
This way you can simply use try/catch to handle your errors, and return partial results inside the parallel function.
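As a sketch of that error-handling approach (the task names and timings are made up, and the second task is simulated to fail), the partial results collected before the rejection are kept and returned alongside the error:

```javascript
// Hypothetical helpers standing in for real fetches.
function wait(ms, data) {
  return new Promise(resolve => setTimeout(() => resolve(data), ms));
}

function failAfter(ms, message) {
  return new Promise((resolve, reject) =>
    setTimeout(() => reject(new Error(message)), ms));
}

async function parallelWithPartialResults() {
  // Start both tasks first so they run in parallel.
  const userTask = wait(10, 'user');
  const postsTask = failAfter(100, 'posts fetch failed');

  const results = {};
  try {
    results.user = await userTask;   // resolves first, gets recorded
    results.posts = await postsTask; // rejects; control jumps to catch
  } catch (err) {
    // The rejection does not discard what we already collected.
    results.error = err.message;
  }
  return results;
}

parallelWithPartialResults().then(r => console.log(r));
// → { user: 'user', error: 'posts fetch failed' }
```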
If you're ok with the fail-fast behaviour of Promise.all and the destructuring assignment syntax:
const [userResponse, postsResponse] = await Promise.all([
  fetchUserAsync(),
  fetchPostsAsync(),
]);