We already have a running system that handles connection strings for multiple databases (DB2, Oracle, MS SQL Server).
Currently, we are using ExecuteNonQuery() to perform our inserts.
We want to improve performance by using SqlBulkCopy() instead of ExecuteNonQuery(), since some of our clients have more than 50 million records.
We don't want to use SSIS, because our system has to support multiple databases.
I created a sample project to test the performance of SqlBulkCopy(), with a simple read-and-insert function for SQL Server.
Here's the small function:
public void insertIntoSQLServer()
{
    using (SqlConnection SourceConnection = new SqlConnection(_sourceConnectionString))
    {
        //Open the connection to get the data from the source table
        SourceConnection.Open();
        using (SqlCommand command = new SqlCommand("select * from " + _sourceSchemaName + "." + _sourceTableName + ";", SourceConnection))
        {
            //Read from the source table; long timeout for large tables
            command.CommandTimeout = 2400;
            using (SqlDataReader reader = command.ExecuteReader())
            using (SqlConnection DestinationConnection = new SqlConnection(_destinationConnectionString))
            {
                DestinationConnection.Open();
                //Clean the destination table
                using (SqlCommand deleteCommand = new SqlCommand("delete from " + _destinationSchemaName + "." + _destinationTableName + ";", DestinationConnection))
                {
                    deleteCommand.ExecuteNonQuery();
                }
                using (SqlBulkCopy bc = new SqlBulkCopy(DestinationConnection))
                {
                    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
                    //Raise SqlRowsCopied every 10,000 rows
                    bc.NotifyAfter = 10000;
                    //bc.SqlRowsCopied += bc_SqlRowsCopied;
                    bc.WriteToServer(reader);
                }
            }
        }
    }
}
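(For completeness: bc_SqlRowsCopied, the handler I commented out above, is just a progress callback. A minimal sketch of what it would look like:)

void bc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e)
{
    //Fires every NotifyAfter rows; handy for logging progress on long-running copies
    Console.WriteLine("Copied {0} rows so far...", e.RowsCopied);
}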
When I have fewer than 200,000 records in my dummyTable, the bulk copy works fine. But when it's over 200,000 records, I get timeout errors.
Update: I increased the CommandTimeout for the reader. That seems to have solved the timeout issue related to the IDataReader.
Am I doing something wrong in the code?
Can you try adding the following before the call to WriteToServer ...
bc.BatchSize = 10000;
bc.BulkCopyTimeout = 0;
The default BulkCopyTimeout is only 30 seconds, and the default BatchSize is 0, which means the entire copy is sent as a single batch, so I suspect this is your issue. Hope that helps.
Also, you can try playing with different batch sizes for optimal performance.
You can try this:
bc.BatchSize = 100000;   // How many rows to insert at a time
bc.BulkCopyTimeout = 60; // Timeout in seconds; if you want infinite waiting time, assign 0
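Putting both suggestions together, the bulk-copy block from the question might end up looking like this. Note that EnableStreaming is my own addition (available from .NET 4.5 onward); it streams rows from the IDataReader instead of buffering them all in memory, which matters at 50 million rows:

using (SqlBulkCopy bc = new SqlBulkCopy(DestinationConnection))
{
    bc.DestinationTableName = string.Format("[{0}].[{1}]", _destinationSchemaName, _destinationTableName);
    bc.BatchSize = 10000;      // Commit every 10,000 rows instead of one giant batch
    bc.BulkCopyTimeout = 0;    // No timeout; the default is only 30 seconds
    bc.EnableStreaming = true; // Stream from the IDataReader (.NET 4.5+); my own addition
    bc.NotifyAfter = 10000;
    bc.WriteToServer(reader);
}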