Is it better to pass large inserts to SQL Server as a table valued parameter, or as a string insert statement?

I am writing a .NET application that writes data to SQL Server 2008 R2. I have two options for inserting the data: either I can build a large string INSERT statement and send it as a text command, or I can collect the data in a .NET DataTable and pass it as a table-valued parameter. What are the benefits and costs of each method?

(I am omitting a good deal of code since I am just asking about the relative benefits, not the specific syntax)

e.g.:

Option 1:

    string insert = @"insert into MyTable (id, val) values
        (1, 'a'), (2, 'b'), (3, 'c'), (4, 'd');";

Option 2:

    DataTable dt = new DataTable();
    dt.Columns.Add("id", typeof(int));
    dt.Columns.Add("val", typeof(string));
    ....

    create procedure uspMyProc
        @tt ttMyTableType readonly
    as
    begin
        insert into TestTable1 (id, strValue)
        select myId, myVal from @tt;
    end

Thanks for any help.

asked May 25 '12 by Sako73

People also ask

Which is faster BCP or bulk insert?

BCP is faster than BULK INSERT in most cases.

How many inserts can SQL Server handle?

The INSERT ... VALUES syntax is limited to 1,000 rows, but that limit can be worked around with INSERT INTO ... SELECT FROM (VALUES ...). As for the second part of the question, in the SQL world the vast majority of statements are all-or-nothing.

What is table valued parameters in SQL Server?

Table-valued parameters are declared by using user-defined table types. You can use table-valued parameters to send multiple rows of data to a Transact-SQL statement or a routine, such as a stored procedure or function, without creating a temporary table or many parameters.

Can we pass a table as a parameter to a stored procedure?

Table-valued parameters (TVPs) are commonly used to pass a table as a parameter into stored procedures or functions. They are helpful because a table can be used as an input to these routines, avoiding more complex workarounds.


1 Answer

Option 3: In the first instance I would give the insert stored procedure one row's worth of parameters and call it repeatedly in a loop from the C# code.
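
A rough sketch of what that loop might look like; uspInsertMyTable, its @id/@val parameters, and the rows collection are hypothetical names, not from the question:

    // requires System.Data and System.Data.SqlClient
    using (SqlConnection conn = new SqlConnection(connectionString)) // placeholder connection string
    using (SqlCommand cmd = new SqlCommand("uspInsertMyTable", conn)) // hypothetical single-row insert proc
    {
        cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter id = cmd.Parameters.Add("@id", SqlDbType.Int);
        SqlParameter val = cmd.Parameters.Add("@val", SqlDbType.VarChar, 50);

        conn.Open();
        foreach (var row in rows) // rows: whatever collection holds the data to insert
        {
            id.Value = row.Id;
            val.Value = row.Val;
            cmd.ExecuteNonQuery(); // one fully parameterised call per row
        }
    }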

Option 4: If you truly have lots of rows to insert, perhaps you need to look into the SqlBulkCopy class. It consumes a DataTable, an array of DataRow objects, or an IDataReader. You can make an IDataReader from a list of objects using some custom code; a question of this ilk is asked here:

Get an IDataReader from a typed List
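
If you go that route, a minimal SqlBulkCopy sketch using the DataTable from the question might look like this (the destination table name and connection string are assumptions):

    // requires System.Data and System.Data.SqlClient
    using (SqlConnection conn = new SqlConnection(connectionString)) // placeholder connection string
    {
        conn.Open();
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.MyTable"; // assumed destination table
            bulk.ColumnMappings.Add("id", "id");   // source column -> destination column
            bulk.ColumnMappings.Add("val", "val");
            bulk.WriteToServer(dt);                // also accepts DataRow[] or an IDataReader
        }
    }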


I would say it depends.

If you really want to pass many rows of parameters in tabular form, for whatever reason, use a table valued parameter - that's what it's there for.

I have seen Option 1 - some generic DAL code would script out a SQL "batch" of commands to run. It worked, but didn't give any defence against injection attacks. Parameterised SQL does.


All that said, I would favour calling the insert sproc once for each row to be inserted from code - the calls will be fully parameterised and performance is fine. If performance becomes a problem I would favour Option 4.

answered Sep 19 '22 by Adam Houldsworth