I have a simple data flow. The source is a small flat file with approximately 16k rows in it. The destination is an OLE DB destination, a SQL 2008 table with a three-part unique key on it. The data flow passes through some simple transformations: Row Count, Derived Column, Data Conversion, etc.
All simple, and it all works fine.
My problem is that within this data there are 2 rows which duplicate the unique key of 2 other rows, so 4 rows in total are involved. On the OLE DB destination I have set the error output to Redirect row, and the failing rows are sent to an error table which has enough columns for me to identify the bad rows.
The problem is that even though there are only 4 culprits, the transformation keeps writing 1268 rows to the error table.
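For what it's worth, one way to double-check which of the redirected rows genuinely clash on the key is to group the error table by the three key columns. The table and column names below are made up, so substitute your own:

    SELECT KeyCol1, KeyCol2, KeyCol3, COUNT(*) AS RowsWithThisKey
    FROM dbo.ImportErrors               -- hypothetical error table
    GROUP BY KeyCol1, KeyCol2, KeyCol3  -- the three columns of the unique key
    HAVING COUNT(*) > 1;                -- only keys that appear more than once

Anything returned by that query is a genuinely duplicated key; the other redirected rows did not violate the key themselves.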
Any ideas?
Thanks.
**
Just to add, if I remove the 2 duplicate rows the whole file imports successfully: 16,875 rows. There is no question that only 2 rows violate the key, yet the error redirection affects 1268.
**
I have found the solution.
The problem goes away if you load the data using the data access mode 'Table or view' in the OLE DB Destination rather than 'Table or view - fast load'.
The only relevant comment I can find is on MSDN:
Any constraint failure at the destination causes the entire batch of rows defined by FastLoadMaxInsertCommitSize to fail.
So it seems that the batch size was 1268 rows in my case, and the 2 duplicate rows that were violating the key caused the whole batch to be redirected to the error destination table.
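That also matches how plain T-SQL behaves: a single INSERT covering many rows is atomic, so one key violation throws out the whole statement, whereas row-by-row inserts only lose the offending row. A rough sketch of the difference, using made-up table and values:

    -- Temp table with a three-part unique key, mirroring the destination.
    CREATE TABLE #Demo
    (
        Key1 int NOT NULL,
        Key2 int NOT NULL,
        Key3 int NOT NULL,
        CONSTRAINT UQ_Demo UNIQUE (Key1, Key2, Key3)
    );

    -- Fast-load style: many rows in one statement. The one duplicate makes
    -- the whole statement fail, so none of the three rows goes in.
    BEGIN TRY
        INSERT INTO #Demo (Key1, Key2, Key3)
        VALUES (1, 1, 1), (2, 2, 2), (1, 1, 1);
    END TRY
    BEGIN CATCH
        PRINT 'Whole batch rejected: ' + ERROR_MESSAGE();
    END CATCH;

    SELECT COUNT(*) AS AfterBatchInsert FROM #Demo;  -- 0 rows

    -- 'Table or view' style: one row per statement, so only the duplicate fails.
    INSERT INTO #Demo (Key1, Key2, Key3) VALUES (1, 1, 1);
    INSERT INTO #Demo (Key1, Key2, Key3) VALUES (2, 2, 2);
    BEGIN TRY
        INSERT INTO #Demo (Key1, Key2, Key3) VALUES (1, 1, 1);
    END TRY
    BEGIN CATCH
        PRINT 'Only this row rejected: ' + ERROR_MESSAGE();
    END CATCH;

    SELECT COUNT(*) AS AfterRowByRow FROM #Demo;     -- 2 rows

    DROP TABLE #Demo;

The trade-off is that 'Table or view' inserts row by row, so it is noticeably slower than fast load on big files; on a 16k-row file that hardly matters.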