 

Any way to SQLBulkCopy "insert or update if exists"?

I need to update a very large table periodically and SQLBulkCopy is perfect for that, except that I have a two-column unique index that prevents duplicates. Is there a way to use SQLBulkCopy as "insert or update if exists"?

If not, what is the most efficient way of doing so? Again, I am talking about a table with millions of records.

Thank you

asked Feb 03 '11 by Sol

People also ask

Does SqlBulkCopy update?

SqlBulkCopy, as the name suggests, is for copying (inserting) bulk records and cannot perform update operations. Hence the Table-Valued Parameter comes to the rescue: it allows us to pass multiple records via a DataTable to a stored procedure, where the upsert processing can be done.
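A minimal sketch of that TVP pattern, assuming a user-defined table type dbo.BookType and a stored procedure dbo.usp_UpsertBooks that does the upsert with MERGE (all object and column names here are hypothetical):

// Assumed T-SQL objects (names are illustrative):
//
//   CREATE TYPE dbo.BookType AS TABLE
//       (ISBN NVARCHAR(20) PRIMARY KEY, Title NVARCHAR(200), Price DECIMAL(10,2));
//
//   CREATE PROCEDURE dbo.usp_UpsertBooks @Books dbo.BookType READONLY
//   AS
//       MERGE dbo.Books AS target
//       USING @Books AS source ON target.ISBN = source.ISBN
//       WHEN MATCHED THEN
//           UPDATE SET Title = source.Title, Price = source.Price
//       WHEN NOT MATCHED THEN
//           INSERT (ISBN, Title, Price) VALUES (source.ISBN, source.Title, source.Price);

using System.Data;
using System.Data.SqlClient;

static void UpsertBooks(DataTable books, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.usp_UpsertBooks", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Pass the DataTable as a table-valued parameter.
        SqlParameter param = cmd.Parameters.AddWithValue("@Books", books);
        param.SqlDbType = SqlDbType.Structured;
        param.TypeName = "dbo.BookType";

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}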

What is BatchSize in SqlBulkCopy?

BatchSize = 4000; — by default, SqlBulkCopy processes the operation in a single batch: if you have 100,000 rows to copy, all 100,000 rows are sent at once. Not specifying a BatchSize can impact your application, for example by decreasing SqlBulkCopy performance.
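For illustration, a small sketch of setting BatchSize on SqlBulkCopy; the destination table name is a placeholder:

using System.Data;
using System.Data.SqlClient;

static void BulkCopyInBatches(DataTable source, string connectionString)
{
    using (var bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.DestinationTableName = "dbo.Books"; // hypothetical target table
        bulkCopy.BatchSize = 4000;                   // send rows to the server in batches of 4000
        bulkCopy.BulkCopyTimeout = 0;                // no timeout for very large loads
        bulkCopy.WriteToServer(source);
    }
}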

What is the use of SqlBulkCopy command?

The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
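As a sketch of the IDataReader route, the snippet below streams rows from one connection into a SQL Server table; the query, connection strings, and table names are purely illustrative:

using System.Data;
using System.Data.SqlClient;

static void CopyFromReader(string sourceConnString, string destConnString)
{
    using (var sourceConn = new SqlConnection(sourceConnString))
    using (var cmd = new SqlCommand("SELECT ISBN, Title, Price FROM dbo.Books", sourceConn))
    {
        sourceConn.Open();

        // Any IDataReader works here, not just SqlDataReader.
        using (IDataReader reader = cmd.ExecuteReader())
        using (var bulkCopy = new SqlBulkCopy(destConnString))
        {
            bulkCopy.DestinationTableName = "dbo.Books";
            bulkCopy.WriteToServer(reader);
        }
    }
}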


2 Answers

I published a NuGet package (SqlBulkTools) to solve this problem.

Here's a code example that would achieve a bulk upsert.

var bulk = new BulkOperations();
var books = GetBooks();

using (TransactionScope trans = new TransactionScope())
{
    using (SqlConnection conn = new SqlConnection(ConfigurationManager
        .ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
    {
        // Insert new rows and update existing ones, matching on the ISBN column.
        bulk.Setup<Book>()
            .ForCollection(books)
            .WithTable("Books")
            .AddAllColumns()
            .BulkInsertOrUpdate()
            .MatchTargetOn(x => x.ISBN)
            .Commit(conn);
    }

    trans.Complete();
}

For very large tables, there are options to add table locks and temporarily disable non-clustered indexes. See SqlBulkTools Documentation for more examples.

answered Sep 18 '22 by Greg R Taylor


I would bulk load data into a temporary staging table, then do an upsert into the final table. See here for an example of doing an upsert.
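A sketch of this staging-table pattern, assuming a target table dbo.Books with a unique index on (ISBN, Edition) and a staging table dbo.Books_Staging with the same columns; all object and column names are hypothetical:

using System.Data;
using System.Data.SqlClient;

static void UpsertViaStaging(DataTable rows, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // 1. Bulk load the new data into the empty staging table.
        using (var bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.Books_Staging";
            bulkCopy.BatchSize = 5000;
            bulkCopy.WriteToServer(rows);
        }

        // 2. Upsert from staging into the final table on the two-column key,
        //    then clear the staging table for the next run.
        const string upsertSql = @"
            MERGE dbo.Books AS target
            USING dbo.Books_Staging AS source
                ON target.ISBN = source.ISBN AND target.Edition = source.Edition
            WHEN MATCHED THEN
                UPDATE SET target.Title = source.Title, target.Price = source.Price
            WHEN NOT MATCHED THEN
                INSERT (ISBN, Edition, Title, Price)
                VALUES (source.ISBN, source.Edition, source.Title, source.Price);

            TRUNCATE TABLE dbo.Books_Staging;";

        using (var cmd = new SqlCommand(upsertSql, conn))
        {
            cmd.CommandTimeout = 0; // large merges can take a while
            cmd.ExecuteNonQuery();
        }
    }
}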

answered Sep 21 '22 by btilly