I am relatively new to .NET programming and multithreading in general, and was wondering if it is OK to use the .NET-provided BackgroundWorker to spawn off worker threads to do some work in a console application? From the various documentation online, I see that the intent of this class is more for UI-oriented applications, where you want to do some work in the background but keep the UI responsive, report progress, cancel processing if needed, etc.
In my case, I have a controller class from which I want to spawn multiple worker threads to do some processing (limiting the maximum number of worker threads with a semaphore). Then I want the controller class to block until all the threads have completed. So after I start a worker thread, I want it to be able to notify the controller thread when processing is complete. I see that I can use the BackgroundWorker class and handle the DoWork and RunWorkerCompleted events to accomplish this, but I was wondering whether this is a good idea. Are there better ways to achieve this?
If your requirement is just to block until all the threads have finished, that's really easy - just start new threads and then call Thread.Join on each of them:
using System;
using System.Collections.Generic;
using System.Threading;

public class Test
{
    static void Main()
    {
        var threads = new List<Thread>();
        for (int i = 0; i < 10; i++)
        {
            int copy = i; // Capture the loop variable in a local so each closure sees its own value
            Thread thread = new Thread(() => DoWork(copy));
            thread.Start();
            threads.Add(thread);
        }
        Console.WriteLine("Main thread blocking");
        foreach (Thread thread in threads)
        {
            thread.Join();
        }
        Console.WriteLine("Main thread finished");
    }

    static void DoWork(int thread)
    {
        Console.WriteLine("Thread {0} doing work", thread);
        Random rng = new Random(thread); // Seed each thread's RNG with a unique value
        Thread.Sleep(rng.Next(2000));
        Console.WriteLine("Thread {0} done", thread);
    }
}
EDIT: If you have access to .NET 4.0, then the TPL is definitely the right way to go. Otherwise, I would suggest using a producer/consumer queue (there's plenty of sample code around). Basically you have a queue of work items, and as many consumer threads as you have cores (assuming they're CPU-bound; you'd want to tailor it to your work load). Each consumer thread would take items from the queue and process them, one at a time. Exactly how you manage this will depend on your situation, but it's not terribly complicated. It's even easier if you can come up with all the work you need to do to start with, so that threads can just exit when they find the queue is empty.
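A minimal sketch of that producer/consumer approach, assuming all the work items are known up front so that consumer threads can simply exit when the queue runs dry (the names `queueLock` and `Consume` are mine, not from any library):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public class WorkQueueDemo
{
    static readonly Queue<int> queue = new Queue<int>();
    static readonly object queueLock = new object();

    static void Main()
    {
        // Enqueue all work up front, so consumers can safely exit on an empty queue.
        for (int i = 0; i < 20; i++)
        {
            queue.Enqueue(i);
        }

        // One consumer per core, on the assumption that the work is CPU-bound.
        var workers = new List<Thread>();
        for (int i = 0; i < Environment.ProcessorCount; i++)
        {
            Thread t = new Thread(Consume);
            t.Start();
            workers.Add(t);
        }
        foreach (Thread t in workers)
        {
            t.Join(); // Block until every consumer has drained the queue and exited
        }
        Console.WriteLine("All work complete");
    }

    static void Consume()
    {
        while (true)
        {
            int item;
            lock (queueLock)
            {
                if (queue.Count == 0) return; // No more work: this consumer exits
                item = queue.Dequeue();
            }
            Console.WriteLine("Processing item {0}", item); // Process outside the lock
        }
    }
}
```

If work can arrive while consumers are running, you would instead wait on the lock with Monitor.Wait/Monitor.Pulse rather than exiting on an empty queue.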
This won't work in a Console Application, since SynchronizationContext.Current will never be initialized. This is initialized by Windows Forms or WPF for you, when you're using a GUI application.
That being said, there's no reason to do this. Just use ThreadPool.QueueUserWorkItem, and a reset event (ManualResetEvent or AutoResetEvent) to track the completion state and block your main thread.
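As a rough sketch of that pattern (the counter-plus-event idiom is a common pre-.NET 4 substitute for CountdownEvent; the names `pending` and `done` are my own):

```csharp
using System;
using System.Threading;

public class ThreadPoolDemo
{
    const int WorkItemCount = 10;
    static int pending = WorkItemCount; // Outstanding work items
    static readonly ManualResetEvent done = new ManualResetEvent(false);

    static void Main()
    {
        for (int i = 0; i < WorkItemCount; i++)
        {
            ThreadPool.QueueUserWorkItem(DoWork, i);
        }
        done.WaitOne(); // Block the main thread until the last item signals
        Console.WriteLine("All work items finished");
    }

    static void DoWork(object state)
    {
        Console.WriteLine("Processing item {0}", state);
        // The last work item to finish releases the main thread.
        if (Interlocked.Decrement(ref pending) == 0)
        {
            done.Set();
        }
    }
}
```

Using Interlocked.Decrement for the counter matters here: the work items run concurrently, so a plain `--pending` would race.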
Edit:
After seeing some of the OP comments, I thought I'd add this.
The "nicest" alternative, in my opinion, would be to get a copy of the Rx Framework, since it includes a backport of the TPL from .NET 4 to earlier framework versions. This would let you use the overload of Parallel.ForEach that accepts a ParallelOptions instance, so you can restrict the total number of concurrent operations and have all of the coordination handled for you:
// Using a collection of work items, e.g.: List<Action<object>> workItems;
var options = new ParallelOptions();
options.MaxDegreeOfParallelism = 10; // Restrict to at most 10 concurrent operations (hard-coding a limit is usually not recommended)
// Perform all the actions in the list, in parallel
Parallel.ForEach(workItems, options, item => { item(null); });
However, using Parallel.ForEach, I'd personally let the system manage the degree of parallelism. It will automatically assign an appropriate number of threads (especially when/if this moves to .NET 4).