I have a very big table containing around 20 million rows, and I need to fetch about 4 million of them based on some filtering criteria. All the columns in the filtering criteria are covered by an index, and the table statistics are up to date.
I have been advised that instead of loading all the rows in a single query, I should fetch them in batches, e.g. 80,000 rows at a time, and that this would be faster than loading everything at once.
Does this idea make sense?
If it does, what would be the optimal batch size to load at a time?
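For reference, this is roughly the batching approach that was suggested to me: a keyset-paginated loop that resumes after the last row seen. This is only a minimal sketch; the table name, column names, and filter are placeholders, not my actual schema, and any DB-API driver would work the same way.

```python
import sqlite3  # placeholder driver; any DB-API connection works the same way

BATCH_SIZE = 80_000

def fetch_in_batches(conn):
    """Yield batches of rows using keyset pagination on a numeric primary key."""
    last_id = 0
    while True:
        cur = conn.execute(
            """
            SELECT id, payload
            FROM big_table              -- placeholder table name
            WHERE id > ?                -- keyset: resume after the last row seen
              AND status = 'ACTIVE'     -- placeholder filter criteria
            ORDER BY id
            LIMIT ?
            """,
            (last_id, BATCH_SIZE),
        )
        rows = cur.fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # first selected column is id
```

My question is whether looping like this is actually faster than one big `SELECT`, and if so, how to pick the batch size.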