I've got about 100 million rows that I'm moving in SSIS 2008 via a Data Flow Task. It's pretty much a straight table data copy using a Multicast. My question is this:
Using the OLE DB Destination Editor I have two options: Rows per batch and Maximum insert commit size. What are good settings for these? The only guidance I've been able to find is to set Maximum insert commit size to 2147483647 rather than 0, and then tune both settings through testing. I'm curious whether anyone has discovered anything useful in their own management of these values.
Maximum insert commit size - The default value for this setting is 2147483647 (the largest value for a 4-byte signed integer), which means all incoming rows are committed in a single transaction once the data flow completes successfully. You can specify a smaller positive value to commit after every that many rows instead.
Rows per batch - The default value for this setting is -1, which means all incoming rows are treated as a single batch. You can override this and break the incoming rows into multiple batches; the only allowed value is a positive integer, which specifies the maximum number of rows per batch. A rough T-SQL analogue of both settings is sketched below.
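If it helps to see the same knobs outside of SSIS, these two settings behave much like the BATCHSIZE and ROWS_PER_BATCH options of a plain T-SQL BULK INSERT. This is only a sketch of the analogy; the table and file names are placeholders, not anything from the original package:

    -- Commit every 500,000 rows rather than holding the whole load
    -- in one transaction (the effect of Maximum insert commit size).
    BULK INSERT dbo.TargetTable
    FROM 'C:\load\source.dat'
    WITH (BATCHSIZE = 500000, TABLOCK);

    -- Or keep a single transaction but tell the engine roughly how many
    -- rows are coming so it can plan the load (the effect of Rows per batch).
    BULK INSERT dbo.TargetTable
    FROM 'C:\load\source.dat'
    WITH (ROWS_PER_BATCH = 100000000, TABLOCK);

In SSIS you simply type the equivalent numbers into the two boxes in the OLE DB Destination Editor; the sketch is only meant to show what each value changes at the engine level.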
Define the default size of the buffer that the task uses by setting the DefaultBufferSize property, and define the maximum number of rows in each buffer by setting the DefaultBufferMaxRows property.
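As a rough sketch of how you might tune these without reopening the package each time, dtexec's /SET switch can override task properties at run time. The package and task names here ("BigCopy.dtsx", "Data Flow Task") are assumptions, and the values are only starting points to test:

    rem Hypothetical package/task names; 52428800 bytes = 50 MB buffer
    dtexec /F "BigCopy.dtsx" ^
        /SET "\Package\Data Flow Task.Properties[DefaultBufferSize];52428800" ^
        /SET "\Package\Data Flow Task.Properties[DefaultBufferMaxRows];50000"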
The fast-load data access mode lets you specify the rows per batch and the commit size when inserting into the destination. For example, inserting 5 million records took just over 2 minutes with these settings.
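One way to keep that kind of experimenting cheap is to drive the fast-load settings from variables via property expressions on the Data Flow Task. This is a sketch under assumptions: the component name ("OLE DB Destination") and the User variables are placeholders for illustration:

    Property:   [OLE DB Destination].[FastLoadMaxInsertCommitSize]
    Expression: @[User::CommitSize]

    Property:   [OLE DB Destination].[FastLoadOptions]
    Expression: "TABLOCK,ROWS_PER_BATCH = " + (DT_WSTR,12)@[User::RowsPerBatch]

FastLoadMaxInsertCommitSize corresponds to the Maximum insert commit size box, and the Rows per batch box ends up as the ROWS_PER_BATCH entry inside FastLoadOptions, so both can be changed per run while you test.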
I find this a useful guide: Top 10 SQL Server Integration Services Best Practices, simply because I don't use SSIS enough. However, as HLGEM said, you'll just have to give it a shot yourself and see what happens...