I am running one stored procedure asynchronously (I need to run the same SP around 150 times), like this:
var queryTask = new List<Task>();
for (int i = 0; i < 150; i++)
{
queryTask.Add(da.ExecuteSPAsync("Async" + i.ToString()));
}
Task.WhenAll(queryTask).Wait();
Now, it will create 150 Tasks
and execute them. Can I split these tasks in batches and run them? Will that decrease the load on SQL server side?
Or shall I consider TPL to run it? Like this:
Parallel.For(0, 150, new ParallelOptions { MaxDegreeOfParallelism = 5 },
x => da.ExecuteSP("PPWith5Threads" + x.ToString()));
Which one is better in terms of performance? This is just an example for demonstration purposes; in reality I have a collection of a custom type on which I need to execute some SPs.
You can use a semaphore for this. The concept behind a semaphore is the nightclub bouncer scenario: the bouncer has a limit on the number of people (threads) allowed in the club (thread pool), and as people leave (threads finish), other people can enter (threads can continue), up to the limit.
All of the tasks are started immediately; it is WaitAsync()
that holds each one back (asynchronously, without blocking a thread) until a slot is free. Release()
signals that a slot has opened up so the next waiting task can proceed.
The Delay(5000) here gives the effect of batching, because every task waits roughly the same amount of time; in reality you are more likely to see a few complete at a time. Substitute Delay(5000)
with a random interval to see this more clearly.
class Program
{
static void Main(string[] args)
{
var runner = new SprocRunner(new DataAccess());
var threads = new List<Task>();
for (var i = 0; i < 150; i++)
{
threads.Add(runner.ExecuteSp($"Async {i}"));
}
Task.WaitAll(threads.ToArray());
}
}
public class SprocRunner
{
private readonly System.Threading.SemaphoreSlim batcher = new System.Threading.SemaphoreSlim(10, 10);
private readonly DataAccess da;
public SprocRunner(DataAccess da)
{
this.da = da;
}
public async Task ExecuteSp(string asyncTaskName)
{
await batcher.WaitAsync();
try
{
await this.da.ExecuteSP(asyncTaskName);
}
catch (Exception e)
{
// Don't swallow exceptions silently; at a minimum, log them.
Console.WriteLine(e);
}
finally
{
batcher.Release();
}
}
}
public class DataAccess
{
public Task ExecuteSP(string name)
{
Console.WriteLine(name);
return Task.Delay(5000);
}
}
After reading articles from the likes of Stephen Toub, the position is: if you are doing lots of I/O-bound work, then in some cases using Parallel
isn't a problem and it does allow you to get the job done. The thing to consider is that thread creation is a non-negligible cost, and if you request more threads than are present in the ThreadPool, it will have to inject new ones. This becomes a problem in an environment that makes heavy use of threads, like ASP.NET. Having tonnes of threads sitting around blocked on I/O work is really, really bad and can bring your server to a standstill.
This is where the Task
abstraction really comes into its own: you can start all these tasks and have them wait for the I/O response, and, crucially, they won't block any threads (other than the main thread waiting on the results); only once the I/O completes will threads briefly be used to process the results.
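To answer the original "can I split these tasks in batches?" question directly: yes, you can chunk the work and await each chunk with Task.WhenAll. This is a minimal sketch; the BatchRunner name, the batch size of 10, and the Func<string, Task> delegate standing in for ExecuteSPAsync are all assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class BatchRunner
{
    // Runs the work items in sequential batches of `batchSize`,
    // so at most `batchSize` sproc calls are in flight at once.
    public static async Task RunInBatches(IEnumerable<string> names,
                                          Func<string, Task> executeSp,
                                          int batchSize)
    {
        foreach (var batch in names
            .Select((name, index) => (name, index))
            .GroupBy(x => x.index / batchSize, x => x.name))
        {
            // Start every call in this batch, then wait for the whole
            // batch to complete before starting the next one.
            await Task.WhenAll(batch.Select(executeSp));
        }
    }
}
```

Usage would look like `await BatchRunner.RunInBatches(Enumerable.Range(0, 150).Select(i => $"Async {i}"), da.ExecuteSP, 10);`. Note the trade-off: each batch waits for its slowest item before the next batch starts, whereas the semaphore approach keeps the pipeline full by releasing a slot as soon as any single call finishes.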