Fastest way to insert 1 million rows in SQL Server [duplicate]

Tags: c#, sql, sql-server

I am writing a stored procedure to insert rows into a table. The problem is that in some operations we might want to insert more than 1 million rows, and we want to make that fast. Another thing is that one of the columns is NVARCHAR(MAX), and we might want to put an average of 1,000 characters in it.

Firstly, I wrote a procedure to insert row by row. Then I generated some random data, with the NVARCHAR(MAX) column set to a string of 1,000 characters, and called the procedure in a loop to insert the rows. The performance is very bad: it takes 48 minutes if I log on to the database server and run the insert from SQL Server itself. If I use a C# program on my desktop to connect to the server (which is what we usually want to do), it takes more than 90 minutes.

Then I changed the procedure to take a table-valued parameter as the input. I prepare the rows, put them in the table-valued parameter, and do the insert with the following command:

INSERT INTO tableA SELECT * from @tableTypeParameterB

I tried batch sizes of 1,000 and 3,000 rows (putting 1,000-3,000 rows in @tableTypeParameterB per insert). The performance is still bad: about 3 minutes to insert 1 million rows if I run it on the SQL Server machine, and about 10 minutes if I connect from a C# program on my desktop.
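
For reference, the C# side of the table-valued-parameter call is shaped roughly like this. The procedure name, table type name, and column names below are placeholders, not our real schema:

    using System.Data;
    using System.Data.SqlClient;

    class TvpInsertSketch
    {
        static void Main()
        {
            string connectionString = "...";              // real connection string here

            // Build one batch of rows shaped like the user-defined table type.
            var batch = new DataTable();
            batch.Columns.Add("KeyCol1", typeof(int));
            batch.Columns.Add("KeyCol2", typeof(int));
            batch.Columns.Add("BigText", typeof(string)); // the NVARCHAR(MAX) column

            for (int i = 0; i < 3000; i++)                // one 3,000-row batch
                batch.Rows.Add(i, i, new string('x', 1000));

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.InsertIntoTableA", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                var p = cmd.Parameters.AddWithValue("@tableTypeParameterB", batch);
                p.SqlDbType = SqlDbType.Structured;
                p.TypeName = "dbo.TableTypeB";            // the user-defined table type
                conn.Open();
                cmd.ExecuteNonQuery();                    // proc runs the INSERT ... SELECT
            }
        }
    }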

tableA has a clustered index on 2 columns.

My target is to make the insert as fast as possible (my ideal target is within 1 minute). Is there any way to optimize it?


Just an update:

I tried the bulk copy insert suggested by some people below, using SqlBulkCopy to insert 1,000 rows and 10,000 rows at a time. It still takes 10 minutes to insert 1 million rows (every row has a column with 1,000 characters). There is no performance improvement. Are there any other suggestions?
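
The SqlBulkCopy attempt is shaped roughly like this (placeholder names again; SqlBulkCopyOptions.TableLock is an option often suggested for large loads, assuming nothing else needs the table while it runs):

    using System.Data;
    using System.Data.SqlClient;

    // 'batch' is a DataTable shaped like dbo.tableA, as in the earlier sketch.
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
        {
            bulk.DestinationTableName = "dbo.tableA";
            bulk.BatchSize = 10000;       // rows per batch sent to the server
            bulk.BulkCopyTimeout = 0;     // disable the timeout for a long load
            bulk.ColumnMappings.Add("KeyCol1", "KeyCol1");
            bulk.ColumnMappings.Add("KeyCol2", "KeyCol2");
            bulk.ColumnMappings.Add("BigText", "BigText");
            bulk.WriteToServer(batch);
        }
    }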


An update based on requests in the comments:

The data actually comes from the UI. The user will use the UI to bulk-select, say, one million rows and change one column from an old value to a new value. That operation will be done in a separate procedure, but here the mid-tier service needs to get the old and new values from the UI and insert them into the table. The old and new values may be up to 4,000 characters, with an average of 1,000. I think the long old/new value strings slow things down: when I change the test data's old/new values to 20-50 characters, the insert is very fast whether I use SqlBulkCopy or a table-valued parameter.

Mandy asked Jul 22 '14


1 Answer

I think what you are looking for is BULK INSERT, if you prefer using SQL.
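
Something along these lines, issued from C# (a hedged sketch: the file path, terminators, and file layout below are hypothetical, and the file must be readable by the SQL Server instance itself, not by the client machine):

    using System.Data.SqlClient;

    // Run a T-SQL BULK INSERT against a data file that lives on the server.
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(@"
        BULK INSERT dbo.tableA
        FROM 'C:\loads\tableA.dat'
        WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK);", conn))
    {
        cmd.CommandTimeout = 0;   // large loads can exceed the default timeout
        conn.Open();
        cmd.ExecuteNonQuery();
    }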

Or there is also the ADO.NET batch operations option, so you keep the logic in your C# application.
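
A rough sketch of that batching approach with SqlDataAdapter; UpdateBatchSize controls how many statements are sent per round trip, and the column and parameter names are placeholders:

    using System.Data;
    using System.Data.SqlClient;

    // 'batch' is a DataTable whose rows are all in the Added state.
    using (var conn = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter())
    {
        adapter.InsertCommand = new SqlCommand(
            "INSERT INTO dbo.tableA (KeyCol1, KeyCol2, BigText) VALUES (@k1, @k2, @t)",
            conn);
        adapter.InsertCommand.Parameters.Add("@k1", SqlDbType.Int, 0, "KeyCol1");
        adapter.InsertCommand.Parameters.Add("@k2", SqlDbType.Int, 0, "KeyCol2");
        adapter.InsertCommand.Parameters.Add("@t", SqlDbType.NVarChar, -1, "BigText"); // -1 = MAX
        adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None; // required for batching
        adapter.UpdateBatchSize = 1000;   // statements per round trip

        adapter.Update(batch);            // inserts all Added rows in batches
    }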

Update

Yes, I'm afraid BULK INSERT only works with imported files, i.e. files read from within the database server itself.

I had a similar experience in a Java project where we needed to insert millions of rows (the data came from outside the application, by the way).

The database was Oracle, so of course we used Oracle's multi-row insert. It turned out that the Java batch update was much faster than Oracle's multi-valued insert (the so-called "bulk updates").

My suggestion is:

  • Compare the performance of SQL Server's multi-value insert (run from inside the database, in a procedure if you like) with the ADO.NET batch insert.

If the data you are going to manipulate comes from outside your application (i.e., it is not already in the database), I would say just go for the ADO.NET batch inserts. I think that's your case.

Note: Keep in mind that batch inserts usually execute the same parameterized statement over and over. That is what makes them so fast.

Evandro Pomatti answered Oct 08 '22