
Why do I have to rebuild the indexes on a table after inserting >150,000 records?

I have a daily process which imports about 150,000 records into two tables, and later joins them to compare against other data.

I have created indexes on the tables which makes the comparison process very fast. Unfortunately, after the import process, the fragmentation on my indexes jumps way up, usually well over 50%. I then have to rebuild the indexes before I can run my comparison query.

This doesn't make sense to me. Shouldn't the index update itself appropriately when data is inserted? Are there any properties on the index that would affect this behavior?

Edit: some additional information. I have two tables. During the import process, one table gets about 150,000 rows added to it via an INSERT/SELECT statement; the other gets 150,000 rows via SqlBulkCopy at the application level.
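One way to measure that fragmentation in SQL Server is the sys.dm_db_index_physical_stats DMV; a minimal sketch, where dbo.ImportStaging is a hypothetical stand-in for one of the two import tables:

```sql
-- Report average fragmentation for every index on the table.
SELECT  i.name AS index_name,
        ips.avg_fragmentation_in_percent,
        ips.page_count
FROM    sys.dm_db_index_physical_stats(
            DB_ID(), OBJECT_ID(N'dbo.ImportStaging'), NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
    ON  i.object_id = ips.object_id
    AND i.index_id  = ips.index_id;
```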

asked Oct 21 '25 by Slider345

1 Answer

But the index is updating itself properly. Unless you are adding data exactly in the order of the index, the index will fragment.

A few things to consider:

  • Can you insert the data in the order of the index?
    This will reduce fragmentation.
  • Consider dropping the index, inserting the data, and then rebuilding the index (sketched below).
    If you are seeing 50% fragmentation, this will most likely be faster overall; a fragmented index also slows down inserts.
  • If you have to leave the index in place during the import, consider a fill factor of 50% (also sketched below).
    This leaves half of each page free, which significantly slows the rate of fragmentation from inserts.
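A rough T-SQL sketch of the last two options (the names dbo.ImportStaging, IX_ImportStaging_Key, and ImportKey are hypothetical stand-ins; adapt them to the actual schema):

```sql
-- Option 2: drop the index, load the data, then recreate the index.
-- Bulk loading into an unindexed table avoids fragmenting the index at all.
DROP INDEX IX_ImportStaging_Key ON dbo.ImportStaging;

-- ... run the daily import here (INSERT ... SELECT / SqlBulkCopy) ...

CREATE NONCLUSTERED INDEX IX_ImportStaging_Key
    ON dbo.ImportStaging (ImportKey);

-- Option 3: keep the index, but rebuild it with 50% free space per page,
-- so out-of-order inserts have room and fragmentation builds up more slowly.
ALTER INDEX IX_ImportStaging_Key ON dbo.ImportStaging
    REBUILD WITH (FILLFACTOR = 50);
```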
answered Oct 24 '25 by paparazzo


