What’s the best way to do bulk database inserts from C#?

Tags: c#, sql-server

People also ask

How can I speed up bulk insert?

Below are some good ways to improve BULK INSERT operations: use TABLOCK as a query hint; drop indexes during the bulk load operation and recreate them once it completes; change the recovery model of the database to BULK_LOGGED for the duration of the load.
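
For illustration, here is a minimal sketch of the first and last tips from C#. The database name MyDb, the assumption that the normal recovery model is FULL, and the table parameter are placeholders, and ALTER DATABASE requires elevated permissions:

using System.Data;
using System.Data.SqlClient;

// Sketch: apply the TABLOCK hint via SqlBulkCopyOptions.TableLock and
// temporarily switch to the BULK_LOGGED recovery model. "MyDb" and the
// assumption that the normal model is FULL are placeholders.
static void TunedBulkLoad(string connectionString, DataTable table)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        new SqlCommand("ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED", connection)
            .ExecuteNonQuery();
        try
        {
            using (var bulkCopy = new SqlBulkCopy(
                connection, SqlBulkCopyOptions.TableLock, null))
            {
                bulkCopy.DestinationTableName = table.TableName;
                bulkCopy.WriteToServer(table);
            }
        }
        finally
        {
            // restore the original recovery model
            new SqlCommand("ALTER DATABASE MyDb SET RECOVERY FULL", connection)
                .ExecuteNonQuery();
        }
    }
}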

Is bulk insert faster than insert?

In the case of BULK INSERT, only extent allocations are logged instead of the actual data being inserted, which provides much better performance than INSERT. The real advantage is reducing the amount of data written to the transaction log (this minimal logging applies when the database uses the BULK_LOGGED or SIMPLE recovery model).

What is the best and fast way to insert 2 million rows of data into SQL Server?

You can try the SqlBulkCopy class. It lets you efficiently bulk load a SQL Server table with data from another source.

How bulk add data in database?

The basic syntax for bulk importing data is: INSERT ... SELECT * FROM OPENROWSET(BULK...). When used in an INSERT statement, OPENROWSET(BULK...) reads the contents of a data file as a rowset that can be inserted directly into the target table.
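
As a rough sketch, that statement can be issued from C# with a plain SqlCommand; the file paths, format file, and table name below are assumptions, and the files must be accessible from the server:

using System.Data.SqlClient;

// Sketch: run an OPENROWSET(BULK ...) import from C#. The paths and
// table name are placeholders; a format file describing the data
// layout must exist on the server.
static void BulkImportFromFile(string connectionString)
{
    const string sql =
        @"INSERT INTO dbo.TargetTable
          SELECT * FROM OPENROWSET(
              BULK 'C:\data\rows.dat',
              FORMATFILE = 'C:\data\rows.fmt') AS src;";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        command.ExecuteNonQuery();
    }
}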


CsharperGuyInLondon, here's a simple example of SqlBulkCopy code:

using System.Data;
using System.Data.SqlClient;

DataTable table = new DataTable("States");
// construct the DataTable schema
table.Columns.Add(new DataColumn("id_state", typeof(int)));
table.Columns.Add(new DataColumn("state_name", typeof(string)));

// note: if "id_state" is defined as an identity column in your DB,
// row values for that column will be ignored during the bulk copy
table.Rows.Add(1, "Atlanta");
table.Rows.Add(2, "Chicago");
table.Rows.Add(3, "Springfield");

// connectionString is your SQL Server connection string
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.BulkCopyTimeout = 600; // in seconds
    bulkCopy.DestinationTableName = "state";
    bulkCopy.WriteToServer(table);
}

Well, 10 items isn't what I call bulk, but for larger sets, SqlBulkCopy is your friend. All you need to do is feed it either a DataTable or an IDataReader (my preferred option, 'cos I like streaming APIs). I did something similar here (you can ignore the xml side - just subclass the SimpleDataReader).
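
For reference, a minimal sketch of the IDataReader route: SqlBulkCopy.WriteToServer accepts any IDataReader, so rows can be streamed straight from a source query without first materializing a DataTable. The connection strings, query, and destination table below are assumptions:

using System.Data.SqlClient;

// Sketch: stream rows from a source query into a destination table.
// The full row set is never buffered in memory; SqlBulkCopy pulls
// rows from the reader as it goes.
static void StreamCopy(string sourceConnStr, string destConnStr)
{
    using (var source = new SqlConnection(sourceConnStr))
    using (var destination = new SqlConnection(destConnStr))
    {
        source.Open();
        destination.Open();

        var query = new SqlCommand("SELECT id_state, state_name FROM state", source);
        using (SqlDataReader reader = query.ExecuteReader())
        using (var bulkCopy = new SqlBulkCopy(destination))
        {
            bulkCopy.DestinationTableName = "state_archive"; // assumed table
            bulkCopy.WriteToServer(reader);
        }
    }
}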


The .NET SqlBulkCopy class works quite well.


Regarding the SqlBulkCopy solution, I created a class that takes a List<T> and a buffer size (CommitBatchSize). It converts the list to a DataTable using an extension method (in the second class) and inserts it in batches.

It works very fast: on my PC, I am able to insert more than 10 million complex records in less than 10 seconds.

Here is the class:

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

namespace DAL
{

public class BulkUploadToSql<T>
{
    public IList<T> InternalStore { get; set; }
    public string TableName { get; set; }
    public int CommitBatchSize { get; set; } = 1000;
    public string ConnectionString { get; set; }

    public void Commit()
    {
        if (InternalStore.Count > 0)
        {
            // page through the list, inserting one batch at a time
            int numberOfPages = (InternalStore.Count / CommitBatchSize)
                + (InternalStore.Count % CommitBatchSize == 0 ? 0 : 1);
            for (int pageIndex = 0; pageIndex < numberOfPages; pageIndex++)
            {
                DataTable dt = InternalStore
                    .Skip(pageIndex * CommitBatchSize)
                    .Take(CommitBatchSize)
                    .ToDataTable();
                BulkInsert(dt);
            }
        }
    }

    public void BulkInsert(DataTable dt)
    {
        using (SqlConnection connection = new SqlConnection(ConnectionString))
        {
            connection.Open();

            // FireTriggers makes the server fire insert triggers for the
            // copied rows; TableLock takes a bulk update lock for speed
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
                connection,
                SqlBulkCopyOptions.TableLock |
                SqlBulkCopyOptions.FireTriggers |
                SqlBulkCopyOptions.UseInternalTransaction,
                null))
            {
                // set the destination table name
                bulkCopy.DestinationTableName = TableName;

                // write the data held in "dt"
                bulkCopy.WriteToServer(dt);
            }
        }
    }

}

public static class BulkUploadToSqlHelper
{
    public static DataTable ToDataTable<T>(this IEnumerable<T> data)
    {
        // map each public property of T to a DataTable column,
        // unwrapping Nullable<X> to X so the column type is concrete
        PropertyDescriptorCollection properties =
            TypeDescriptor.GetProperties(typeof(T));
        DataTable table = new DataTable();
        foreach (PropertyDescriptor prop in properties)
            table.Columns.Add(prop.Name,
                Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

        // copy each item into a row, substituting DBNull for null values
        foreach (T item in data)
        {
            DataRow row = table.NewRow();
            foreach (PropertyDescriptor prop in properties)
                row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
            table.Rows.Add(row);
        }
        return table;
    }
}

}

Here is an example where I insert a List of my custom object List<PuckDetection> (ListDetections):

var objBulk = new BulkUploadToSql<PuckDetection>()
{
    InternalStore = ListDetections,
    TableName = "PuckDetections",
    CommitBatchSize = 1000,
    ConnectionString = "ENTER YOUR CONNECTION STRING"
};
objBulk.Commit();

You can build a BLOB (image) and send it as a parameter to a stored procedure. Inside the stored procedure, you can extract the individual items using SUBSTRING().
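
A rough sketch of the client side of that idea, assuming fixed-width 4-byte records packed into a varbinary parameter and a hypothetical stored procedure dbo.UnpackAndInsert that walks the blob server-side with SUBSTRING():

using System;
using System.Data;
using System.Data.SqlClient;

// Sketch: pack 4-byte ids into one varbinary blob and hand it to a
// stored procedure (dbo.UnpackAndInsert is hypothetical) that splits
// it apart with SUBSTRING() and inserts the rows.
static void SendBlob(string connectionString, int[] ids)
{
    byte[] blob = new byte[ids.Length * 4];
    for (int i = 0; i < ids.Length; i++)
        BitConverter.GetBytes(ids[i]).CopyTo(blob, i * 4);

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.UnpackAndInsert", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        // size -1 means varbinary(max)
        command.Parameters.Add("@items", SqlDbType.VarBinary, -1).Value = blob;
        connection.Open();
        command.ExecuteNonQuery();
    }
}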


Here's a good example of SqlBulkCopy in action:

http://blogs.msdn.com/nikhilsi/archive/2008/06/11/bulk-insert-into-sql-from-c-app.aspx


I construct the list as an XML string and pass it to the stored proc. SQL Server 2005 has enhanced XML functionality to parse the XML and do a bulk insert.

Check this post: Passing lists to SQL Server 2005 with XML Parameters
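
A minimal sketch of the client side of that approach; the element names and the stored procedure dbo.InsertStates are assumptions, and the proc would shred the XML with nodes()/value() and INSERT ... SELECT:

using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Xml.Linq;

// Sketch: serialize the list to XML and pass it as a single parameter
// to a hypothetical stored procedure (dbo.InsertStates).
static void InsertViaXml(string connectionString, (int Id, string Name)[] states)
{
    var xml = new XElement("states",
        states.Select(s => new XElement("state",
            new XAttribute("id", s.Id),
            new XAttribute("name", s.Name))));

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.InsertStates", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add("@statesXml", SqlDbType.Xml).Value = xml.ToString();
        connection.Open();
        command.ExecuteNonQuery();
    }
}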