
Delete files after processing with Spark Structured Streaming

I am using the file source in Spark Structured Streaming and want to delete the files after I process them.

I am reading in a directory filled with JSON files (1.json, 2.json, etc.) and writing them out as Parquet files. I want to remove each input file after Spark has successfully processed it.

asked Apr 28 '17 by saul.shanabrook

2 Answers

EDIT 2: Changed my Go script to read from the sources log instead. New script.

EDIT: I am trying this out currently, and it might be deleting files before they are processed. I am still looking for a better solution and investigating this method.

I solved this temporarily by writing a Go script. It scans the checkpoint folder that I set in Spark, parses the files in it to figure out which input files Spark has already written out, and then deletes them if they still exist. It does this every 10 seconds.

However, this relies on Spark's checkpoint file structure and representation (JSON), which is not documented and could change at any point. I also have not looked through the Spark source code to verify that the files I am reading (checkpoint/sources/0/...) are the real source of truth for processed files. It seems to be working at the moment, though, and is better than doing it manually.
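The idea above can be sketched in a few lines (shown here in Python rather than Go, for brevity). This is a hedged sketch, not the answerer's actual script: it assumes the undocumented layout described above, i.e. one log file per batch under checkpoint/sources/0/, each holding a version header ("v1") followed by one JSON entry per processed file with a "path" field containing a file:// URI.

```python
import json
import os
from urllib.parse import urlparse


def processed_paths(checkpoint_dir):
    """Collect input-file paths recorded in the file source's log.

    Assumes the (undocumented) checkpoint layout: batch log files under
    checkpoint/sources/0/, each with a "v1" header line followed by one
    compact-JSON entry per processed file.
    """
    source_log = os.path.join(checkpoint_dir, "sources", "0")
    paths = set()
    if not os.path.isdir(source_log):
        return paths
    for name in os.listdir(source_log):
        with open(os.path.join(source_log, name)) as f:
            for line in f:
                line = line.strip()
                if not line.startswith("{"):
                    continue  # skip the version header ("v1")
                entry = json.loads(line)
                # entries record URIs like file:///data/in/1.json
                paths.add(urlparse(entry["path"]).path)
    return paths


def delete_processed(checkpoint_dir):
    """Delete every input file that the checkpoint log says was processed."""
    for p in processed_paths(checkpoint_dir):
        if os.path.exists(p):
            os.remove(p)
```

A real cleanup daemon would run `delete_processed` on a timer (the answer uses 10 seconds) against the same checkpoint directory passed to Spark's `checkpointLocation` option.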

answered Oct 13 '22 by saul.shanabrook


This is now possible in Spark 3: you can use the "cleanSource" option for readStream.

Thanks to the documentation https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html and this video https://www.youtube.com/watch?v=EM7T34Uu2Gg.
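A minimal PySpark sketch of the option, matching the question's JSON-to-Parquet setup. The paths, the schema, and the app name are placeholder assumptions; per the linked guide, `cleanSource` accepts "archive", "delete", or "off" (the default), and "archive" additionally requires `sourceArchiveDir`.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cleanSourceDemo").getOrCreate()

stream = (
    spark.readStream
    .format("json")
    .schema("id INT, value STRING")   # file sources require an explicit schema
    .option("cleanSource", "delete")  # delete each input file after its batch completes
    # .option("cleanSource", "archive")            # alternative: move files instead
    # .option("sourceArchiveDir", "/data/archive") # required when archiving
    .load("/data/in")                 # placeholder input directory
)

query = (
    stream.writeStream
    .format("parquet")
    .option("path", "/data/out")                  # placeholder output directory
    .option("checkpointLocation", "/data/checkpoint")
    .start()
)
```

Note that cleanup happens asynchronously after a batch commits, so deletion may lag a little behind processing.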

After searching for many hours, I finally found the solution.

answered Oct 13 '22 by Mr AK