I'm trying to find a way to gracefully end my jobs that stream from PubSub and write to BigQuery, so as not to lose any data.
A possible approach I can envision is to have the job stop pulling new data and then run until it has processed everything, but I don't know if/how this is possible to implement.
It appears this feature was added in the latest release. All you have to do now is select the Drain option when stopping a job (instead of cancelling it outright).
Thanks.
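For reference, draining can also be triggered from the command line. This is a minimal sketch assuming the `gcloud` CLI is installed and authenticated, and that `<JOB_ID>` and `<REGION>` are placeholders for your own job:

```shell
# Drain (rather than cancel) a streaming Dataflow job: the job stops
# pulling new PubSub messages, finishes processing any in-flight data,
# and then terminates, so no buffered records are dropped.
gcloud dataflow jobs drain <JOB_ID> --region=<REGION>
```

You can watch the job transition through the DRAINING state in the console (or with `gcloud dataflow jobs list`) until it finishes.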
I believe this would be difficult (if not impossible) to do on your own. We (Google Cloud Dataflow team) are aware of this need and are working on addressing it with a new feature in the coming months.