I have a DataTable and need the entire thing pushed to a database table.
I can get it all in there with a foreach, inserting one row at a time, but that is very slow since there are a few thousand rows.
Is there any way to push the entire DataTable at once that might be faster?
The DataTable has fewer columns than the SQL table; the rest should be left NULL.
I discovered SqlBulkCopy is an easy way to do this, and does not require a stored procedure to be written in SQL Server.
Here is an example of how I implemented it:
// Take note of SqlBulkCopyOptions.KeepIdentity -- you may or may not want to use this for your situation.
using (var bulkCopy = new SqlBulkCopy(_connection.ConnectionString, SqlBulkCopyOptions.KeepIdentity))
{
    // My DataTable column names match my SQL column names, so I simply made this loop.
    // If your column names don't match, add explicit ColumnMappings pairs instead:
    // (source DataTable column name, destination SQL column name).
    foreach (DataColumn col in table.Columns)
    {
        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
    }

    bulkCopy.BulkCopyTimeout = 600;
    bulkCopy.DestinationTableName = destinationTableName;
    bulkCopy.WriteToServer(table);
}
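For example, a single explicit mapping pair takes the source DataTable column name first and the destination SQL column name second (the names below are hypothetical):

bulkCopy.ColumnMappings.Add("CustName", "CustomerName"); // DataTable column -> SQL column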
Since you have a DataTable already, and since I am assuming you are using SQL Server 2008 or better, this is probably the most straightforward way. First, in your database, create the following two objects:
CREATE TYPE dbo.MyDataTable -- you can be more specific here
AS TABLE
(
col1 INT,
col2 DATETIME
-- etc etc. The columns you have in your data table.
);
GO
CREATE PROCEDURE dbo.InsertMyDataTable
@dt AS dbo.MyDataTable READONLY
AS
BEGIN
SET NOCOUNT ON;
INSERT dbo.RealTable(column list) SELECT column list FROM @dt;
END
GO
Now in your C# code:
DataTable tvp = new DataTable();
// define / populate DataTable to match dbo.MyDataTable's columns

using (connectionObject)
{
    SqlCommand cmd = new SqlCommand("dbo.InsertMyDataTable", connectionObject);
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter tvparam = cmd.Parameters.AddWithValue("@dt", tvp);
    // AddWithValue can't infer the type for a DataTable, so set it explicitly:
    tvparam.SqlDbType = SqlDbType.Structured;
    connectionObject.Open(); // the connection must be open before executing
    cmd.ExecuteNonQuery();
}
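For this to run, the DataTable's columns have to line up with the table type. A minimal sketch of the "define / populate" step, using the col1/col2 definition from the type above (the values are placeholders):

DataTable tvp = new DataTable();
tvp.Columns.Add("col1", typeof(int));
tvp.Columns.Add("col2", typeof(DateTime));
tvp.Rows.Add(1, DateTime.Now); // sample row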
If you had given more specific details in your question, I would have given a more specific answer.
Consider this approach; you don't need a foreach loop:
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.BulkCopyDemoMatchingColumns";
    try
    {
        // Write from the source to the destination. Note that WriteToServer
        // takes the data itself (a DataTable, DataRow[] or IDataReader),
        // not a table name.
        bulkCopy.WriteToServer(sourceTable);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
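If the source rows live in another SQL table rather than already in memory, one way to load them into a DataTable first is a SqlDataAdapter; a minimal sketch, where the SELECT statement and table name are placeholders:

DataTable sourceTable = new DataTable();
using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.SourceTable", connection))
{
    adapter.Fill(sourceTable);
}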
If you can deviate a little from the straight path of DataTable -> SQL table, it can also be done via a list of objects:
1) DataTable -> generic list of objects
public static IList<T> ConvertTo<T>(DataTable table) where T : new()
{
    IList<T> list = new List<T>();
    PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));

    foreach (DataRow row in table.Rows)
    {
        T item = new T();
        foreach (PropertyDescriptor prop in properties)
        {
            // Only copy columns that exist in the table and are not NULL;
            // everything else keeps the property's default value.
            if (table.Columns.Contains(prop.Name) && row[prop.Name] != DBNull.Value)
                prop.SetValue(item, row[prop.Name]);
        }
        list.Add(item);
    }

    return list;
}
Source and more details can be found here. Properties without a matching column will keep their default values (0 for ints, null for reference types, etc.).
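Hypothetical usage, assuming a POCO whose properties mirror the DataTable columns (DataTable column lookup is case-insensitive by default, so Col1 matches col1):

public class MyRow
{
    public int Col1 { get; set; }
    public DateTime? Col2 { get; set; }
}

IList<MyRow> rows = ConvertTo<MyRow>(table);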
2) Push the objects into the database
One way is to use the EntityFramework.BulkInsert extension. An EF data context is required, though.
It generates the BULK INSERT command needed for fast inserts (the user-defined table type solution is much slower than this).
Although not the most direct method, it helps build a foundation for working with lists of objects instead of DataTables, which seems to be much more memory efficient.
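A minimal sketch, assuming the EntityFramework.BulkInsert NuGet package and an existing EF context (MyDbContext and rows are hypothetical):

using (var ctx = new MyDbContext())
{
    // BulkInsert is the extension method added by the package
    ctx.BulkInsert(rows);
}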