
Cannot SQLBulkCopy Error 40197 with %d code of 4815 (Connection Forcibly Closed)

Developing with VS 2013 ASP.NET MVC 5 Web Project and Separate Azure hosted SQL Server database.

At the bottom is all the error information from Visual Studio 2013. I've narrowed down the problem and found a link to Microsoft's description of it, but no solution. I'm developing Database First with Entity Framework 6, ASP.NET 4 MVC and Razor, and I connect to a SQL Azure database - I think that is what's falling over; I've already checked the logs for the Azure website etc.

I have delimited text files (uploaded to APP_DATA) that I load into a DataTable, then use SqlBulkCopy to dump the content into the Azure database. It all works 100% fine as long as the files contain only a few hundred records, but I need to insert 20MB files with approximately 200,000 rows. When I try the big files I get an error at the point where ASP.NET performs the bulk copy. No matter what I set for batch size etc. it bails around the 4,000-row mark every time. I've exhausted all the options and I'm at my wits' end; I even tried scaling the Azure database up from Free Web to Business, and scaling up the website too. Here is the code:

public void BatchBulkCopy(DataTable dataTable, string DestinationTbl, int batchSize,int identity)
{
    try
    {
        // Log the start of the bulk copy.
        System.Diagnostics.Debug.WriteLine("Start SQL Bulk Copy");

    using (SqlBulkCopy sbc = new SqlBulkCopy("Server=tcp:eumtj4loxy.database.windows.net,1433;Database=AscWaterDB;User ID=HIDDEN@HIDDEN;Password=XXXXXXX;Trusted_Connection=False;Encrypt=True;Connection Timeout=900;", SqlBulkCopyOptions.TableLock))
        {
        sbc.DestinationTableName = DestinationTbl;
        sbc.BulkCopyTimeout = 0;

        // Number of records to be processed in one go (use the method's batchSize parameter)
        sbc.BatchSize = batchSize;

        // Add your column mappings here
        sbc.ColumnMappings.Add("D2001_SPID", "SupplyPointId");
        sbc.ColumnMappings.Add("D2002_ServiceCategory", "D2002_ServiceCategory");
        sbc.ColumnMappings.Add("D2025_NotifyDisconnection/Reconnection", "D2025_NotifyDisconnectionReconnection");
        sbc.ColumnMappings.Add("WaterBatchId", "WaterBatchId");
        sbc.ColumnMappings.Add("D2003_Schedule3", "D2003_Schedule3");
        sbc.ColumnMappings.Add("D2004_ExemptCustomerFlag", "D2004_ExemptCustomerFlag");
        sbc.ColumnMappings.Add("D2005_CustomerClassification", "D2005_CustomerClassification");
        sbc.ColumnMappings.Add("D2006_29e", "D2006_29e");
        sbc.ColumnMappings.Add("D2007_LargeVolAgreement", "D2007_LargeVolAgreement");
        sbc.ColumnMappings.Add("D2008_SICCode", "D2008_SICCode");
        sbc.ColumnMappings.Add("D2011_RateableValue", "D2011_RateableValue");
        sbc.ColumnMappings.Add("D2015_SPIDVacant", "D2015_SPIDVacant");
        sbc.ColumnMappings.Add("D2018_TroughsDrinkingBowls", "D2018_TroughsDrinkingBowls");
        sbc.ColumnMappings.Add("D2019_WaterServicesToCaravans", "D2019_WaterServicesToCaravans");
        sbc.ColumnMappings.Add("D2020_OutsideTaps", "D2020_OutsideTaps");
        sbc.ColumnMappings.Add("D2022_TransitionalArrangements", "D2022_TransitionalArrangements");
        sbc.ColumnMappings.Add("D2024_Unmeasurable", "D2024_Unmeasurable");
        sbc.ColumnMappings.Add("D2014_FarmCroft", "D2014_FarmCroft");

        // Finally write to server
        System.Diagnostics.Debug.WriteLine("Write Bulk Copy to Server " + DateTime.Now.ToString());
        sbc.WriteToServer(dataTable); // Fails here when I upload a 20MB CSV with 190,000 rows
        sbc.Close();
    }

    // Ignore this - I don't reach this code unless I'm loading a file that's only got a few records
    WaterBatch obj = GetWaterBatch(identity); // Now we can get the WaterBatch
    obj.StopDateTime = DateTime.Now;
    Edit(obj);
    Save();
    System.Diagnostics.Debug.WriteLine("Finished " + DateTime.Now.ToString());

    }
    catch (Exception ex)
    {
        // Walk down to the innermost exception and log its message.
        Exception ex2 = ex;
        while (ex2.InnerException != null)
        {
            ex2 = ex2.InnerException;
        }
        Console.WriteLine(ex2.Message);
        throw;
    }
}

My $exception says:

$exception  {"A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)"}    System.Exception {System.Data.SqlClient.SqlException}

If I drill into the InnerException (and its inner exceptions, and so on) it is the same message, with HResults of -2146232060 and then -2147467259:

InnerException  {"An existing connection was forcibly closed by the remote host"}   System.Exception {System.ComponentModel.Win32Exception}

UPDATED INFO:

Microsoft's explanation of the error is below. I am getting error number 40197; Microsoft says to look at the embedded %d code, which in my case is 4815. The question is: where can I go from here to get information on a 40197 with a %d of 4815?

I got the following info regarding my error from this link: http://msdn.microsoft.com/en-us/library/windowsazure/ff394106.aspx

40197 17 The service has encountered an error processing your request. Please try again. Error code %d. You will receive this error, when the service is down due to software or hardware upgrades, hardware failures, or any other failover problems. The error code (%d) embedded within the message of error 40197 provides additional information about the kind of failure or failover that occurred. Some examples of the error codes embedded within the message of error 40197 are 40020, 40143, 40166, and 40540. Reconnecting to your SQL Database server will automatically connect you to a healthy copy of your database. Your application must catch error 40197, log the embedded error code (%d) within the message for troubleshooting, and try reconnecting to SQL Database until the resources are available, and your connection is established again.
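Microsoft's guidance above boils down to catching the transient error, waiting, and reconnecting. A minimal sketch of that idea is below; the helper name, attempt counts, and the always-true transient predicate are illustrative assumptions, not part of the question's code (with a real `SqlException` you would check `Number` against codes like 40197):

```csharp
using System;
using System.Threading;

// Hypothetical retry wrapper for transient SQL Azure errors such as 40197.
static class TransientRetry
{
    public static void ExecuteWithRetry(Action action, int maxAttempts = 4, int baseDelayMs = 2000)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                action();
                return; // succeeded
            }
            catch (Exception ex) when (attempt < maxAttempts && IsTransient(ex))
            {
                // Back off before reconnecting, per the 40197 guidance.
                Thread.Sleep(baseDelayMs * attempt);
            }
        }
    }

    // Placeholder predicate: in real code, inspect SqlException.Number
    // for known transient codes (40197, 40501, 40613, ...).
    static bool IsTransient(Exception ex) => true;
}
```

Wrapping each chunk of the upload (rather than one giant `WriteToServer` call) would mean a dropped connection only forces a retry of one batch, e.g. `TransientRetry.ExecuteWithRetry(() => sbc.WriteToServer(chunk));`.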

Asked Feb 14 '23 by John

1 Answer

I was getting the exact same error during a bulk insert. In my case, it was a varchar column that was overflowing; I just needed to increase the character limit and the problem was solved.
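One way to diagnose this before the bulk copy runs is to scan the DataTable for string values longer than the destination varchar widths. This is a hedged sketch: the helper name and the column widths passed in are assumptions, not taken from the question's schema:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Hypothetical preflight check: report DataTable string values that exceed
// the destination column's varchar width, before handing off to SqlBulkCopy.
static class BulkCopyPreflight
{
    public static List<string> FindOversizedValues(DataTable table, Dictionary<string, int> maxLengths)
    {
        var problems = new List<string>();
        foreach (DataRow row in table.Rows)
        {
            foreach (var kv in maxLengths)
            {
                var value = row[kv.Key] as string;
                if (value != null && value.Length > kv.Value)
                {
                    problems.Add($"Row {table.Rows.IndexOf(row)}: column '{kv.Key}' " +
                                 $"has {value.Length} chars (max {kv.Value})");
                }
            }
        }
        return problems;
    }
}
```

Calling it with the destination widths, e.g. `FindOversizedValues(dataTable, new Dictionary<string, int> { { "D2005_CustomerClassification", 50 } })` (the width 50 here is a made-up example), would list the offending rows so you can either widen the column or clean the data.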

Answered Feb 17 '23 by Guilherme