 

Reading from compressed files in Dataflow

Is there a way (or any kind of hack) to read input data from compressed files? My input consists of a few hundred gzip-compressed files, and decompressing them beforehand is somewhat tedious.

G B asked Jan 26 '26 17:01

2 Answers

Reading from compressed text sources is now supported in Dataflow (as of this commit). Specifically, files compressed with gzip and bzip2 can be read by specifying the compression type:

TextIO.Read.from(myFileName).withCompressionType(TextIO.CompressionType.GZIP)

However, if the file has a .gz or .bz2 extension, you don't have to do anything: the default compression type is AUTO, which examines file extensions to determine the correct compression type for each file. This even works with globs, where the files matched by the glob may be a mix of .gz, .bz2, and uncompressed.
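To make the AUTO behavior concrete, here is a minimal standalone sketch (JDK only, no Dataflow SDK): a hypothetical `detect` method mimics extension-based compression detection, and a gzip round-trip uses `java.util.zip`. The class, enum, and method names are illustrative, not the SDK's API.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CompressionSketch {
    // Hypothetical enum mirroring the idea of TextIO.CompressionType.
    enum Compression { GZIP, BZIP2, UNCOMPRESSED }

    // Illustrative extension-based detection, similar in spirit to AUTO.
    static Compression detect(String fileName) {
        if (fileName.endsWith(".gz")) return Compression.GZIP;
        if (fileName.endsWith(".bz2")) return Compression.BZIP2;
        return Compression.UNCOMPRESSED;
    }

    // Compress a string with gzip using only the JDK.
    static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream out = new GZIPOutputStream(buf)) {
            out.write(text.getBytes(StandardCharsets.UTF_8));
        }
        return buf.toByteArray();
    }

    // Decompress gzip bytes back to a string.
    static String gunzip(byte[] data) throws IOException {
        try (InputStream in = new GZIPInputStream(new ByteArrayInputStream(data));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) out.write(chunk, 0, n);
            return new String(out.toByteArray(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(detect("logs/part-0001.gz"));
        System.out.println(gunzip(gzip("hello dataflow")));
    }
}
```

The same extension check is what lets AUTO handle a glob matching a mix of compressed and uncompressed files: each file's type is decided individually.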

MattL answered Jan 29 '26 14:01


The slower performance with my workaround was most likely because Dataflow was putting most of the files in the same split, so they weren't being processed in parallel. You can try the following to speed things up.

  • Create a PCollection for each file by applying the Create transform multiple times (each time to a single file).
  • Use the Flatten transform to create a single PCollection containing all the files from PCollections representing individual files.
  • Apply your pipeline to this PCollection.
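The steps above can be sketched with plain Java collections (no Dataflow SDK here; `createFromFile` and `flatten` are hypothetical stand-ins for the Create and Flatten transforms, with a List playing the role of a PCollection):

```java
import java.util.*;
import java.util.stream.Collectors;

public class FlattenSketch {
    // Stand-in for applying Create to a single file: one collection
    // per file, so each file gets its own "PCollection".
    static List<String> createFromFile(String fileName, List<String> records) {
        return new ArrayList<>(records);
    }

    // Stand-in for Flatten: merge the per-file collections into one
    // collection, which the real pipeline would then process in parallel.
    static List<String> flatten(List<List<String>> perFile) {
        return perFile.stream()
                      .flatMap(List::stream)
                      .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<List<String>> perFile = new ArrayList<>();
        perFile.add(createFromFile("input-0.gz", Arrays.asList("a", "b")));
        perFile.add(createFromFile("input-1.gz", Arrays.asList("c")));
        // The flattened collection is what the rest of the pipeline consumes.
        System.out.println(flatten(perFile));
    }
}
```

The point of the pattern is that building one source per file prevents the runner from lumping many files into a single split, so the per-file collections can be processed concurrently before being merged.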
Jeremy Lewi answered Jan 29 '26 14:01