Simple: Don't start the BackgroundWorker twice.
You can check if it is already running by using the IsBusy
property, so just change this code:
worker.RunWorkerAsync();
to this:
if (!worker.IsBusy)
    worker.RunWorkerAsync();
else
    MessageBox.Show("Can't run the worker twice!");
Update:
If you actually do need to launch multiple background tasks at the same time, you can simply create multiple BackgroundWorker objects.
Create a new BackgroundWorker for each operation that you want to perform. I.e., rather than:
BackgroundWorker worker = new BackgroundWorker();
worker.DoWork += new DoWorkEventHandler(worker_DoWork);
for (int i = 0; i < max; i++) {
    worker.RunWorkerAsync(i);
}
Try this:
for (int i = 0; i < max; i++) {
    BackgroundWorker worker = new BackgroundWorker();
    worker.DoWork += new DoWorkEventHandler(worker_DoWork);
    worker.RunWorkerAsync(i);
}
I would look into queueing the tasks that need to be done: only one worker runs at a time, and the pending work is processed in order.
Here is an example implementation: http://thevalerios.net/matt/2008/05/a-queued-backgroundworker. I am not sure whether that implementation is thread-safe, and I will update my answer once I figure out the locking problem in an implementation I am working with.
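In the meantime, here is a minimal sketch of the queueing idea (my own code, not the implementation from the linked article; the QueuedWorker name and the Action<object> payload are arbitrary). It assumes Enqueue is only called from the UI thread, so RunWorkerCompleted fires back on that same thread and no extra locking is needed:

using System;
using System.Collections.Generic;
using System.ComponentModel;

// Work items are queued and a single BackgroundWorker processes them one at a
// time, so RunWorkerAsync is never called while the worker is busy.
public class QueuedWorker
{
    private readonly Queue<object> _pending = new Queue<object>();
    private readonly BackgroundWorker _worker = new BackgroundWorker();
    private readonly Action<object> _work;

    public QueuedWorker(Action<object> work)
    {
        _work = work;
        _worker.DoWork += (s, e) => _work(e.Argument);
        // Completed fires on the thread that called RunWorkerAsync (the UI
        // thread here), so starting the next item is safe without a lock.
        _worker.RunWorkerCompleted += (s, e) => StartNextIfIdle();
    }

    // Call from the UI thread only; items are processed in FIFO order.
    public void Enqueue(object argument)
    {
        _pending.Enqueue(argument);
        StartNextIfIdle();
    }

    private void StartNextIfIdle()
    {
        if (!_worker.IsBusy && _pending.Count > 0)
            _worker.RunWorkerAsync(_pending.Dequeue());
    }
}

Usage would then be something like new QueuedWorker(arg => worker_DoWork(...)) followed by Enqueue(i) inside the loop, instead of calling RunWorkerAsync directly.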
Although this is not the case the OP originally asked about, the error can also be caused by a race condition (it happened to me just now, and I went looking for an answer) if you're using a BackgroundWorker in some sort of producer-consumer pattern.
Example:
if (BckgrndWrkr == null)
{
    BckgrndWrkr = new BackgroundWorker();
    BckgrndWrkr.DoWork += DoWorkMethod;
    BckgrndWrkr.RunWorkerAsync();
}
else if (!BckgrndWrkr.IsBusy)
{
    BckgrndWrkr.RunWorkerAsync();
}
In this case there's a race condition: the first caller instantiates a new BackgroundWorker, then a second caller reaches the else if and starts the worker before the first caller reaches the RunWorkerAsync call in the if block; when the first caller does reach it, the error is thrown.
This can be avoided by adding a lock around the entire if + else if section.
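For illustration, a sketch of that fix (the _workerLock field and StartWorkerIfIdle method name are mine): the check and the call to RunWorkerAsync happen atomically, so a second producer can no longer slip in between them.

private readonly object _workerLock = new object();

private void StartWorkerIfIdle()
{
    lock (_workerLock)
    {
        // IsBusy is set synchronously by RunWorkerAsync, so once one caller
        // has started the worker inside the lock, later callers see IsBusy == true.
        if (BckgrndWrkr == null)
        {
            BckgrndWrkr = new BackgroundWorker();
            BckgrndWrkr.DoWork += DoWorkMethod;
            BckgrndWrkr.RunWorkerAsync();
        }
        else if (!BckgrndWrkr.IsBusy)
        {
            BckgrndWrkr.RunWorkerAsync();
        }
    }
}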