Why does the following code report an Uncaught (in promise) rejected error when it is being caught?
function Test() {
  this.start = function(action) {
    return new Promise((resolve, reject) => {
      setTimeout(() => {
        if (action == "fail") return reject("rejected");
        resolve("resolved");
      }, 1000);
    });
  };
}
const test = new Test();
const promise = test.start("fail");
promise.then(console.log);
promise.catch(console.error);
The output (in a browser) is:
rejected                         <- from the .catch()
Uncaught (in promise) rejected   <- from the call to reject
You're forking your promise chain:
promise ---> .then( .. ) ---> ???
        \
         \
          +-> .catch( .. )
The rejection is being caught by the .catch just fine, but the same rejection also travels down the .then branch. The two branches are settled independently, and since you never attach a rejection handler on the .then branch, that branch ends in an unhandled rejection, which is what produces the Uncaught (in promise) error.
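One way to fix that is to give the .then branch its own rejection handler; a minimal sketch, reusing the test object from the question (.then accepts a rejection callback as its second argument):

const promise = test.start("fail");

// Branch 1: handle fulfillment and rejection in the same .then()
promise.then(console.log, console.error);

// Branch 2: this .catch() handles the same rejection independently
promise.catch(console.error);

That logs the rejection once per branch and leaves nothing unhandled, but chaining is usually the cleaner fix.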
Compare with:
promise.then(console.log).catch(console.error)
Here the chain looks like this:
promise ---> .then( .. ) ---> .catch( .. )
The rejection will skip the .then handler and go to the nearest .catch.
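For completeness, a minimal runnable version using the Test constructor from the question:

const test = new Test();

test.start("fail")
    .then(console.log)      // skipped, because the promise rejects
    .catch(console.error);  // logs "rejected"; no Uncaught (in promise) warning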