I have a DataTable with around 3000 rows. Each of those rows needs to be inserted into a database table. Currently, I am running a foreach loop as under:
obj_AseCommand.CommandText = sql_proc;
obj_AseCommand.CommandType = CommandType.StoredProcedure;
obj_AseCommand.Connection = db_Conn;
obj_AseCommand.Connection.Open();
foreach (DataRow dr in dt.Rows)
{
obj_AseCommand.Parameters.AddWithValue("@a", dr["a"]);
obj_AseCommand.Parameters.AddWithValue("@b", dr["b"]);
obj_AseCommand.Parameters.AddWithValue("@c", dr["c"]);
obj_AseCommand.ExecuteNonQuery();
obj_AseCommand.Parameters.Clear();
}
obj_AseCommand.Connection.Close();
Can you please advise how I can execute the SP in parallel in the database, since the above approach takes about 10 minutes to insert 3000 rows?
Edit
In hindsight, using a Parallel.ForEach to parallelize DB insertions is slightly wasteful, as it will also consume a thread for each connection. Arguably, an even better parallel solution would be to use the asynchronous versions of the System.Data DB operations, such as ExecuteNonQueryAsync, start the executions concurrently, and then use await Task.WhenAll() to wait upon completion - this will avoid the thread overhead to the caller, although the overall DB performance won't likely be any quicker. More here
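As a rough illustration of that async approach, here is a hedged sketch (not the original answer's code): it assumes a SQL Server connection string in connString and the same SPROC and parameters as the question, and uses a SemaphoreSlim to throttle concurrency in place of MaxDegreeOfParallelism:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static async Task InsertAllAsync(DataTable dt, string connString, string sprocName)
{
    // Throttle to 4 concurrent commands so we don't inundate the database
    using (var throttle = new SemaphoreSlim(4))
    {
        var tasks = dt.Rows.Cast<DataRow>().Select(async dr =>
        {
            await throttle.WaitAsync();
            try
            {
                // One connection per in-flight command; pooling makes this cheap
                using (var con = new SqlConnection(connString))
                using (var cmd = con.CreateCommand())
                {
                    cmd.CommandText = sprocName;
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@a", dr["a"]);
                    cmd.Parameters.AddWithValue("@b", dr["b"]);
                    cmd.Parameters.AddWithValue("@c", dr["c"]);
                    await con.OpenAsync();
                    await cmd.ExecuteNonQueryAsync();
                }
            }
            finally
            {
                throttle.Release();
            }
        });
        // Await all pending inserts without blocking caller threads
        await Task.WhenAll(tasks);
    }
}
```

This requires a live database to run, so treat it as a sketch rather than a drop-in replacement.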
Original Answer, multiple Parallel Inserts into Database
You can do this in parallel using TPL, specifically with the localInit overload of Parallel.ForEach. You will almost certainly want to throttle the amount of parallelism by tweaking MaxDegreeOfParallelism so that you don't inundate your database:
// dt.Rows is only IEnumerable, so cast it to IEnumerable<DataRow>
Parallel.ForEach(dt.Rows.Cast<DataRow>(),
// Adjust this for optimum throughput vs minimal impact to your other DB users
new ParallelOptions { MaxDegreeOfParallelism = 4 },
() =>
{
var con = new SqlConnection(connString); // connString assumed defined elsewhere
var cmd = con.CreateCommand();
cmd.CommandText = sql_proc;
cmd.CommandType = CommandType.StoredProcedure;
con.Open();
cmd.Parameters.Add(new SqlParameter("@a", SqlDbType.Int));
// NB : Size sensitive parameters must have size
cmd.Parameters.Add(new SqlParameter("@b", SqlDbType.VarChar, 100));
cmd.Parameters.Add(new SqlParameter("@c", SqlDbType.Bit));
// Prepare won't help with SPROCs but can improve plan caching for adhoc sql
// cmd.Prepare();
return new {Conn = con, Cmd = cmd};
},
(dr, pls, localInit) =>
{
localInit.Cmd.Parameters["@a"].Value = dr["a"];
localInit.Cmd.Parameters["@b"].Value = dr["b"];
localInit.Cmd.Parameters["@c"].Value = dr["c"];
localInit.Cmd.ExecuteNonQuery();
return localInit;
},
(localInit) =>
{
localInit.Cmd.Dispose();
localInit.Conn.Dispose();
});
Notes:
- You should .Prepare() if you are using AdHoc Sql or Sql versions prior to 2005 (it won't help with SPROCs).
- This assumes that enumerating a DataTable's rows is thread safe. You'll want to double check this of course.

Side note:
10 minutes for 3000 rows is excessive even with a wide table and a single thread. What does your proc do? I've assumed the processing isn't trivial, hence the need for the SPROC, but if you are just doing simple inserts, as per @3dd's comment, SqlBulkCopy will yield inserts of ~ 1M rows per minute on a reasonably narrow table.
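If the proc really is just doing simple inserts, a hedged SqlBulkCopy sketch along these lines streams the whole DataTable in one call; the destination table name and column names here are assumptions, not from the question:

```csharp
using System.Data;
using System.Data.SqlClient;

public static void BulkInsert(DataTable dt, string connString)
{
    using (var con = new SqlConnection(connString))
    {
        con.Open();
        using (var bulk = new SqlBulkCopy(con))
        {
            bulk.DestinationTableName = "dbo.MyTable"; // hypothetical target table
            // Map DataTable columns to destination columns explicitly
            bulk.ColumnMappings.Add("a", "a");
            bulk.ColumnMappings.Add("b", "b");
            bulk.ColumnMappings.Add("c", "c");
            bulk.BatchSize = 1000; // commit in batches rather than one giant transaction
            bulk.WriteToServer(dt);
        }
    }
}
```

Note this bypasses the stored procedure entirely, so it only applies if the proc contains no extra logic.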