Problem statement: How to parallelize inserts in SQL Server (2008)
I am performing massive numeric computations for scientific research using multithreaded C# workers that basically do one thing: test thousands of possible configurations (matrix combinations) over a time period (in days) and store the results in a SQL Server database.
If I store the results in the database one by one (~300,000 rows per computing session × hundreds of sessions), one after the other, I end up waiting hours for the storing process to finish.
The database design is very simple:
Each "Combination Set" is tested against sample days and its per-day results are processed in a single C# thread, where a LINQ/SQL query is generated and sent to DB just before the end of the thread. Except combination set IDs sequences, there is NO logical relation between Results. This is very important : This is why I thought of parallelizing the insert stuff as it basically amounts to a bulk dump of result blocks
Another detail that could be important: it is possible to determine beforehand how many rows will be inserted into the database (per block and in total). This could probably help organize table spaces, split them across pages, pre-assign ID ranges so blocks can be stored simultaneously, or something like that (no, I'm not "high" or something :-) )
I welcome any kind of suggestion to make this insert time as short as possible.
Please take into account that I am a C# developer with very basic SQL Server knowledge, not very familiar with deep technical DBA concepts (I saw that locking tweaks are VERY numerous, and that there are multithreaded and asynchronous capabilities too, but I have to admit I am lost alone in the forest :-) )
I have 12 CPU cores and 24 GB of RAM available.
EDIT:
Tiebreaker
I welcome any clever suggestion for monitoring the time taken by the whole process, from C# thread inception/end to detailed SQL Server insert reports (what happens when, how, and where).
I tried logging with NLog, but it drastically biases the processing time, so I am looking for smart workarounds that are fairly seamless, with minimal impact. The same goes for the SQL Server side: I know there are a couple of logs and monitoring stored procedures available, but I have not yet figured out which ones suit my situation.
300k inserts is a matter of seconds, at worst minutes, not hours. You must be doing it wrong. The ETL SSIS world record back in 2008 was 2.36 TB/hour; 300k records is nothing.
The basic rules of thumb are: batch your commits instead of giving every insert its own transaction, and use bulk insert (SqlBulkCopy) rather than row-by-row inserts.

Pseudocode for batched commits:
do
{
    using (TransactionScope scope = new TransactionScope(
        TransactionScopeOption.Required,
        new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
    {
        // One commit per batch of inserts, not one per row.
        for (int i = 0; i < batchSize; i++)
        {
            ExecuteNonQuery("INSERT ...");
        }
        scope.Complete();
    }
} while (!finished);
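In concrete terms, here is a minimal sketch of batched commits with plain ADO.NET. The Results table, its columns, and the Result type are hypothetical placeholders for your actual schema:

using System.Data.SqlClient;
using System.Transactions;

// Hypothetical row type standing in for the real per-day result.
class Result { public int CombinationSetId; public int Day; public double Value; }

static void InsertBatched(string connectionString, Result[] results, int batchSize)
{
    int i = 0;
    while (i < results.Length)
    {
        // One transaction per batch of rows instead of one per row.
        using (var scope = new TransactionScope(TransactionScopeOption.Required,
            new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "INSERT INTO Results (CombinationSetId, Day, Value) VALUES (@set, @day, @val)", conn))
            {
                var pSet = cmd.Parameters.Add("@set", System.Data.SqlDbType.Int);
                var pDay = cmd.Parameters.Add("@day", System.Data.SqlDbType.Int);
                var pVal = cmd.Parameters.Add("@val", System.Data.SqlDbType.Float);

                for (int j = 0; j < batchSize && i < results.Length; j++, i++)
                {
                    pSet.Value = results[i].CombinationSetId;
                    pDay.Value = results[i].Day;
                    pVal.Value = results[i].Value;
                    cmd.ExecuteNonQuery(); // parameters reused; only values change per row
                }
            }
            scope.Complete();
        }
    }
}

Note that the command and its parameters are created once per batch and only the parameter values change per row; rebuilding the SQL string for every insert would add avoidable overhead.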
SqlBulkCopy
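A minimal SqlBulkCopy sketch, assuming the per-thread results are collected into a DataTable whose columns match a hypothetical dbo.Results destination table:

using System.Data;
using System.Data.SqlClient;

static void BulkInsert(string connectionString, DataTable results)
{
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.Results"; // hypothetical destination table
        bulk.BatchSize = 10000;                    // rows sent to the server per batch
        bulk.BulkCopyTimeout = 0;                  // no timeout for a large load
        bulk.WriteToServer(results);               // streams all rows in one bulk operation
    }
}

WriteToServer also accepts an IDataReader, so the rows can be streamed to the server without materializing a DataTable first.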
The first option alone will get you above 3,000 inserts per second (around 2 minutes for 300k rows). The second option should get you into the tens of thousands of inserts per second. If you need more than that, there are more advanced tricks.
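One well-known trick, sketched here under the assumption that the destination is a heap (no clustered index while loading; dbo.Results is again a placeholder): bulk loads taken with a table lock acquire bulk-update (BU) locks, which are compatible with one another, so several worker threads can bulk-load the same heap concurrently, and indexes can be built after the load:

using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

static void BulkInsertParallel(string connectionString, DataTable[] blocks)
{
    // Each worker loads its own result block; with TableLock on a heap,
    // the BU locks taken by concurrent bulk loads do not block each other.
    Parallel.ForEach(blocks, block =>
    {
        using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
        {
            bulk.DestinationTableName = "dbo.Results"; // hypothetical heap table
            bulk.WriteToServer(block);
        }
    });
}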
That said, I suggest you start with the basics of the basics: batch commits.