What is the difference in PySpark between reading a whole directory and then filtering, versus reading only part of the directory?

Suppose I have a data model that runs daily and the sample HDFS path is

data_model/sales_summary/grass_date=2021-04-01

If I want to read all the model outputs for February and March, what is the difference between reading them in the following two ways?

A:

spark.read.parquet('data_model/sales_summary/grass_date=2021-0{2,3}*')

B:

from pyspark.sql.functions import col

spark.read.parquet('data_model/sales_summary/').filter(col('grass_date').between('2021-02-01', '2021-03-30'))

Are these two reading methods equivalent? If not, under what circumstances is one more efficient than the other?


1 Answer

Spark prunes partitions at read time in both cases, so the performance of the two methods should be similar. In the query plans below, the path glob in method A restricts the file listing itself (note the expanded paths in the InMemoryFileIndex), while method B applies PartitionFilters in the FileScan operation.

spark.read.parquet('data_model/sales_summary/grass_date=2021-0{2,3}*').explain()
== Physical Plan ==
*(1) ColumnarToRow
+- FileScan parquet [id#18] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/tmp/data_model/sales_summary/grass_date=2021-02-21, file:/tmp/data_model/..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<id:int>

from pyspark.sql import functions as F

spark.read.parquet('data_model/sales_summary/').filter(F.col('grass_date').between('2021-02-01', '2021-03-30')).explain()
== Physical Plan ==
*(1) ColumnarToRow
+- FileScan parquet [id#24,grass_date#25] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/tmp/data_model/sales_summary], PartitionFilters: [isnotnull(grass_date#25), (grass_date#25 >= 18659), (grass_date#25 <= 18716)], PushedFilters: [], ReadSchema: struct<id:int>
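
For reference, here is a minimal sketch that produces a partition layout like the one in the plans above, so you can compare the two reads yourself (the /tmp path and the toy id column are illustrative, not part of the original model):

from pyspark.sql import SparkSession, Row
import datetime

spark = SparkSession.builder.getOrCreate()

# One row per day for Feb and Mar 2021; writing with partitionBy creates
# directories like /tmp/data_model/sales_summary/grass_date=2021-02-01
start = datetime.date(2021, 2, 1)
rows = [Row(id=i, grass_date=start + datetime.timedelta(days=i)) for i in range(59)]
spark.createDataFrame(rows).write.partitionBy('grass_date').parquet('/tmp/data_model/sales_summary')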

But note that the partitioning column grass_date will be missing from the dataframe if you use the first method (compare the column lists and ReadSchema in the two plans above), so you'd probably prefer the second method.
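
If you do want the glob-style read but still need grass_date in the dataframe, one option is Spark's basePath setting for partition discovery. A minimal sketch, using the same paths as in the question:

# With basePath pointing at the partition root, Spark still treats grass_date
# as a partition column even though the read path starts below that root,
# so the column is kept in the resulting dataframe.
spark.read \
    .option('basePath', 'data_model/sales_summary/') \
    .parquet('data_model/sales_summary/grass_date=2021-0{2,3}*')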
