I want to insert many rows (constructed from Entity Framework objects) into SQL Server. The problem is that some string properties are longer than the corresponding column in the database, which causes an exception, and then none of the rows can be inserted.
So I wonder: is there a way to tell SqlBulkCopy to automatically truncate any over-length values? Of course, I could check and Substring each property that exceeds the column's length before inserting it into a DataTable, but that would slow down the whole program.
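For reference, the per-property truncation I'm trying to avoid looks roughly like this (the entity type, property names, and column lengths below are made up for illustration):

```csharp
// Hypothetical maximum lengths of the target NVARCHAR columns.
var maxLengths = new Dictionary<string, int>
{
    ["Name"] = 50,      // assumed NVARCHAR(50)
    ["Address"] = 200,  // assumed NVARCHAR(200)
};

// Truncate a string to a column's maximum length; pass nulls through.
static string Truncate(string value, int maxLength) =>
    value != null && value.Length > maxLength
        ? value.Substring(0, maxLength)
        : value;

var table = new DataTable();
table.Columns.Add("Name", typeof(string));
table.Columns.Add("Address", typeof(string));

// 'entities' is the collection of Entity Framework objects to insert.
foreach (var entity in entities)
{
    table.Rows.Add(
        Truncate(entity.Name, maxLengths["Name"]),
        Truncate(entity.Address, maxLengths["Address"]));
}
```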
Always use a staging/load table for bulk actions.
Then you can process, clean, and scrub the data before flushing it to the real table. This includes LEFT()s, lookups, de-duplication, etc.
So:
INSERT realtable (..) SELECT LEFT(..), .. FROM Staging
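A fuller sketch of the staging pattern (table and column names, and the LEFT lengths, are made up; the staging columns use NVARCHAR(MAX) so the bulk copy itself never fails on length):

```sql
-- Staging table: same shape as the target, but with generous column widths.
CREATE TABLE StagingCustomer
(
    Name    NVARCHAR(MAX),
    Address NVARCHAR(MAX)
);

-- After SqlBulkCopy has loaded StagingCustomer, truncate on the way in:
INSERT INTO Customer (Name, Address)
SELECT LEFT(Name, 50), LEFT(Address, 200)
FROM StagingCustomer;

-- Clear the staging table for the next load.
TRUNCATE TABLE StagingCustomer;
```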
Unfortunately, there is no direct way of doing that with SqlBulkCopy. SQL bulk inserts are by nature almost "dumb", which is why they are so fast. They give very little feedback (beyond the SqlRowsCopied event), so if something fails there's not much information. What you're looking for would, in a way, run counter to the purpose of this class.
But there are two possible ways:
You can use the SqlBulkCopyOptions enumeration (passed to the SqlBulkCopy constructor) with SqlBulkCopyOptions.CheckConstraints ("Check constraints while data is being inserted. By default, constraints are not checked.").
Or you can use SqlBulkCopyOptions.FireTriggers ("When specified, cause the server to fire the insert triggers for the rows being inserted into the database.") and handle the over-length values in a SQL Server INSERT trigger.
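A minimal sketch of wiring up those options on the constructor (the connection string, destination table name, and the 'table' DataTable are placeholders; the server-side trigger that would truncate values is not shown):

```csharp
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Combine options with bitwise OR; pass null for the external transaction.
    using (var bulkCopy = new SqlBulkCopy(
        connection,
        SqlBulkCopyOptions.FireTriggers | SqlBulkCopyOptions.CheckConstraints,
        null))
    {
        bulkCopy.DestinationTableName = "dbo.Customer"; // placeholder table
        bulkCopy.WriteToServer(table); // 'table' is the DataTable built earlier
    }
}
```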