 

Flink: what's the best way to handle exceptions inside Flink jobs

Tags:

apache-flink

I have a Flink job that consumes Kafka topics and runs the data through a series of operators. I'm wondering what's the best way to deal with exceptions that happen in the middle of the pipeline.

My goal is to have a centralized place to handle those exceptions that may be thrown from different operators and here is my current solution:

Use a ProcessFunction and, in the catch block, emit the exception to a side output via the Context. Then attach a separate sink function for that side output at the end of the pipeline, where it calls an external service to update the status of another related job.

However, it seems that by doing this I still need to call collector.collect() with a null value in order for the record to proceed through the following operators and reach the last stage, where the side output flows into the separate sink function. Is this the right way to do it?

Also, I'm not sure what actually happens if I don't call collector.collect() inside an operator. Would it hang there and cause a memory leak?
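For concreteness, here is roughly what I have now. This is a sketch with hypothetical Event/Result/ExceptionInfo types; the null value is passed only so that something keeps flowing downstream:

```
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class TransformFunction extends ProcessFunction<Event, Result> {

    // Declared as an anonymous subclass so Flink can capture the type.
    public static final OutputTag<ExceptionInfo> ERRORS =
            new OutputTag<ExceptionInfo>("errors") {};

    @Override
    public void processElement(Event in, Context ctx, Collector<Result> out) {
        try {
            out.collect(transform(in));
        } catch (Exception e) {
            ctx.output(ERRORS, new ExceptionInfo(in, e));
            out.collect(null); // only here so downstream operators still fire; is this needed?
        }
    }

    private Result transform(Event in) throws Exception {
        // ... the actual work that may throw ...
        return new Result(in);
    }
}
```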

Sicong asked Sep 02 '25 15:09

1 Answer

It's fine not to call collector.collect(); if an operator emits nothing for a given input record, that record simply produces no downstream output, and nothing hangs or leaks. You also don't need to call collect() with a null value when you use the side output to capture the exception, since each operator can have its own side output. Finally, if you have multiple such operators with a side output for exceptions, you can union() the side outputs together before sending that combined stream to a sink.
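A minimal sketch of that pattern, assuming Flink's ProcessFunction API and hypothetical Event/EnrichedEvent/ExceptionInfo types (not tested against any specific Flink version):

```
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class EnrichFunction extends ProcessFunction<Event, EnrichedEvent> {

    // Declared as an anonymous subclass so Flink can capture the type.
    public static final OutputTag<ExceptionInfo> ERRORS =
            new OutputTag<ExceptionInfo>("enrich-errors") {};

    @Override
    public void processElement(Event in, Context ctx, Collector<EnrichedEvent> out) {
        try {
            out.collect(enrich(in));
        } catch (Exception e) {
            // No out.collect(null) needed: route the failure to the side
            // output and let this record produce no main output at all.
            ctx.output(ERRORS, new ExceptionInfo(in, e));
        }
    }

    private EnrichedEvent enrich(Event in) throws Exception {
        // ... the work that may throw ...
        return new EnrichedEvent(in);
    }

    // Wiring: each operator exposes its own error side output, and the
    // error streams are unioned into one stream feeding a single sink.
    static void wire(DataStream<Event> events,
                     SingleOutputStreamOperator<Event> validated) {
        SingleOutputStreamOperator<EnrichedEvent> enriched =
                events.process(new EnrichFunction());

        DataStream<ExceptionInfo> errors = enriched
                .getSideOutput(EnrichFunction.ERRORS)
                .union(validated.getSideOutput(ValidateFunction.ERRORS));

        errors.addSink(new StatusUpdateSink()); // calls the external status service
    }
}
```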

If for some reason the downstream operator(s) need to know that there was an exception, then one approach is to output an Either<good result, Exception>, but then each downstream operator would of course need to have code to check what it's receiving.
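Flink ships an org.apache.flink.types.Either you would normally use for this, but the pattern is easy to see with a minimal stand-alone sketch (the Either class and the names below are illustrative, not Flink's):

```java
import java.util.List;
import java.util.stream.Collectors;

// A tiny Either: "left" holds a good result, "right" holds the failure.
final class Either<L, R> {
    final L left;
    final R right;
    private Either(L l, R r) { left = l; right = r; }
    static <L, R> Either<L, R> left(L l)  { return new Either<>(l, null); }
    static <L, R> Either<L, R> right(R r) { return new Either<>(null, r); }
    boolean isLeft() { return right == null; }
}

public class EitherDemo {
    // A downstream "operator": it must check what it received, transforming
    // good values and forwarding failures untouched.
    static Either<Integer, String> next(Either<Integer, String> in) {
        if (in.isLeft()) {
            return Either.left(in.left * 2);
        }
        return in;
    }

    public static void main(String[] args) {
        List<Either<Integer, String>> out = List.of(
                Either.<Integer, String>left(21),
                Either.<Integer, String>right("boom"))
            .stream().map(EitherDemo::next).collect(Collectors.toList());
        System.out.println(out.get(0).left);   // good value, doubled to 42
        System.out.println(out.get(1).right);  // failure passed through: boom
    }
}
```

The cost of this approach is exactly what the answer notes: every downstream operator carries the isLeft() branch, so it is usually only worth it when later stages genuinely need to react to the failure.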

kkrugler answered Sep 05 '25 14:09