
SqlBulkCopy memory issue

Tags: c#, sql, sql-server

We are using the SqlBulkCopy class in C# to bulk insert data into SQL Server. We have a table with 10 million records in it.

We are inserting the data in batches of 10,000 rows in a loop.

We are facing a physical memory issue: memory usage keeps increasing and never goes back down.

Below is our code. How can we release memory when SqlBulkCopy is used, or is there another way to do a bulk insert?

using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn,SqlBulkCopyOptions.TableLock,null))
{
    //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
    bulkCopy.DestinationTableName = DestinationTable;
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = dt1.Rows.Count;
    Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);
    if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
        SQlConn.Open();
    bulkCopy.WriteToServer(dt1); //DataTable
    SQlConn.Close();
    SQlConn.Dispose();
    bulkCopy.Close();
    if (bulkCopy != null)
    {
        ((IDisposable)bulkCopy).Dispose();
    }                        
}

Here is the complete code:

try
{
    using (SqlConnection SQlConn = new SqlConnection(Common.SQLConnectionString))
    {
        DataTable dt1 = FillEmptyDateFields(dtDestination);

        //SqlTableCreator ObjTbl = new SqlTableCreator(SQlConn);
        //ObjTbl.DestinationTableName = DestinationTable;
        using (System.Data.SqlClient.SqlBulkCopy bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn, SqlBulkCopyOptions.TableLock, null))
        {
            //bulkCopy = new System.Data.SqlClient.SqlBulkCopy(SQlConn);
            bulkCopy.DestinationTableName = DestinationTable;
            bulkCopy.BulkCopyTimeout = 0;
            bulkCopy.BatchSize = dt1.Rows.Count;
            Logger.Log("DATATABLE FINAL :" + dt1.Rows.Count.ToString(), Logger.LogType.Info);
            if (SQlConn.State == ConnectionState.Closed || SQlConn.State == ConnectionState.Broken)
                SQlConn.Open();
            bulkCopy.WriteToServer(dt1);
            SQlConn.Close();
            SQlConn.Dispose();
            bulkCopy.Close();
            if (bulkCopy != null)
            {
                ((IDisposable)bulkCopy).Dispose();
            }
        }
    }

    dtDestination.Dispose();
    System.GC.Collect();
    dtDestination = null;
}
catch (Exception ex)
{
    Logger.Log(ex, Logger.LogType.Error);
    throw ex;
}
asked Dec 27 '22 by sunshine


2 Answers

The key question here would be: what is dt1, where did it come from, and how have you released it? DataTable is actually quite tricky to clean out, and frankly I wouldn't normally recommend a DataTable source here. However, if you must use a DataTable, then make sure to use a completely separate DataSet / DataTable per iteration, and release the old one so it can be recycled.
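
A minimal sketch of that per-iteration approach, assuming a hypothetical LoadNextBatch helper that fills a brand-new DataTable with at most 10,000 rows (or returns null when the source is exhausted):

using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(Common.SQLConnectionString))
using (var bulkCopy = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
{
    conn.Open();
    bulkCopy.DestinationTableName = DestinationTable;
    bulkCopy.BulkCopyTimeout = 0;
    bulkCopy.BatchSize = 10000;

    while (true)
    {
        // LoadNextBatch is hypothetical: it returns a brand-new DataTable
        // of at most 10,000 rows, or null when there is nothing left.
        DataTable batch = LoadNextBatch();
        if (batch == null)
            break;

        using (batch) // each iteration owns its own table...
        {
            bulkCopy.WriteToServer(batch);
        }
        // ...and no reference to it survives the loop, so the GC can reclaim it.
    }
}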

More efficient, however, is to use WriteToServer(IDataReader) - this allows you to handle rows in a streaming fashion. If you are copying between two SQL systems, you could even just use ExecuteReader() on a separate command / connection, but IDataReader is pretty simple to implement, and you can write a basic IDataReader for most sources (or find libraries that do so, for example CsvReader for handling delimited files such as csv/tsv).
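
For example, when copying from another SQL Server source, a streaming copy might look like this (a minimal sketch; sourceConnectionString and sourceQuery are hypothetical placeholders for your source connection and query):

using System.Data;
using System.Data.SqlClient;

using (var sourceConn = new SqlConnection(sourceConnectionString))
using (var destConn = new SqlConnection(Common.SQLConnectionString))
{
    sourceConn.Open();
    destConn.Open();

    using (var cmd = new SqlCommand(sourceQuery, sourceConn))
    using (IDataReader reader = cmd.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(destConn, SqlBulkCopyOptions.TableLock, null))
    {
        bulkCopy.DestinationTableName = DestinationTable;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.BatchSize = 10000; // fixed batch size; rows are sent in chunks

        // WriteToServer(IDataReader) pulls rows one at a time from the reader,
        // so the full result set is never materialized in memory.
        bulkCopy.WriteToServer(reader);
    }
}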

answered Jan 11 '23 by Marc Gravell


I guess the issue is with this line:

bulkCopy.BatchSize = dt1.Rows.Count; 

The BatchSize property determines how many rows are inserted in a single internal transaction. Because it is set to dt1.Rows.Count here, the batch size is potentially unbounded.

http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.batchsize.aspx

Setting it to a small, fixed number should solve the problem:

bulkCopy.BatchSize = 1000;

It's up to you to decide the optimal batch size here.

answered Jan 11 '23 by xing