 

Optimal row size to fetch at a time from a big table

I have a very big table containing around 20 million rows, and I have to fetch about 4 million of them based on some filtering criteria. All the columns in the filtering criteria are covered by an index, and the table statistics are up to date.

It has been suggested to me that instead of loading all the rows in a single go, I should fetch them in batches, e.g. 80,000 rows at a time, and that this will be faster than loading everything at once.

Does this idea make sense?

If it does, what would be the optimal batch size to load at a time?

asked Nov 13 '22 by MoveFast

1 Answer

  1. It can be much faster than a single SQL query, because each batch is a small, index-friendly range scan and the client never has to buffer millions of rows at once.
  2. Split the data into batches using the primary key (PK); see the sketch after this list.
  3. The batch size depends on the row length and the per-batch processing time. Start with 10,000 and tune from there.
  4. Run the batches in parallel threads if possible.
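
Point 2 describes keyset pagination: walk the table in PK order, remember the last key seen, and fetch the next batch with `WHERE id > last_id`. Below is a minimal sketch in Python using mysql-connector-python; the table name `events`, the columns `id`/`status`/`payload`, the connection credentials, and the `process()` helper are all hypothetical placeholders, not anything from the original question.

```python
import mysql.connector

BATCH_SIZE = 10_000  # starting point from the answer; tune against your workload


def process(rows):
    """Placeholder for whatever per-batch work you need to do."""
    print(f"processing {len(rows)} rows")


conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="mydb"
)
cur = conn.cursor()

last_id = 0
while True:
    # WHERE id > %s ORDER BY id LIMIT n keeps every query an index range
    # scan starting just past the last key seen, unlike LIMIT/OFFSET,
    # whose cost grows with the offset.
    cur.execute(
        "SELECT id, payload FROM events"
        " WHERE id > %s AND status = %s"
        " ORDER BY id LIMIT %s",
        (last_id, "pending", BATCH_SIZE),
    )
    rows = cur.fetchall()
    if not rows:
        break
    process(rows)
    last_id = rows[-1][0]  # resume just after the highest id in this batch

cur.close()
conn.close()
```

Keying the batches on the PK rather than on `LIMIT ... OFFSET` matters at this scale: with an offset, MySQL still has to walk past all the skipped rows on every query, so later batches get progressively slower, while the keyset approach stays roughly constant per batch.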
answered Dec 28 '22 by baklarz2048