Since .NET 4.0, the autogenerated add/remove event accessors are thread safe (here and here). Therefore, clients that register their listeners with an exposed event can do so concurrently from multiple threads without races.
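For context, here is a rough sketch of the kind of code the compiler emits for a field-like event since then: add and remove update the backing delegate field in a lock-free compare-and-swap loop. The class name, field name and exact loop shape are illustrative, not the compiler-generated ones:

using System;
using System.Threading;

public class Publisher
{
    // Illustrative backing field; the real generated field has a compiler-chosen name.
    private EventHandler _myEvent;

    public event EventHandler MyEvent
    {
        add
        {
            // Lock-free update: retry the compare-and-swap until no other thread
            // has modified the field between our read and our exchange.
            EventHandler current = _myEvent;
            EventHandler original;
            do
            {
                original = current;
                var updated = (EventHandler)Delegate.Combine(original, value);
                current = Interlocked.CompareExchange(ref _myEvent, updated, original);
            }
            while (!ReferenceEquals(current, original));
        }
        remove
        {
            // Symmetric to add, using Delegate.Remove instead of Delegate.Combine.
            EventHandler current = _myEvent;
            EventHandler original;
            do
            {
                original = current;
                var updated = (EventHandler)Delegate.Remove(original, value);
                current = Interlocked.CompareExchange(ref _myEvent, updated, original);
            }
            while (!ReferenceEquals(current, original));
        }
    }
}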
But what if I want to fire the event in a thread-safe manner? The recommended practice seems to be the following (here):
public event EventHandler MyEvent;

protected void OnMyEvent(EventArgs e)
{
    // Copy to a local so the null check and the invocation use the same delegate instance.
    EventHandler myEvent = MyEvent;
    if (myEvent != null)
    {
        myEvent(this, e);
    }
}
However, having read about the .NET memory model (e.g. MSDN Magazine 2012-12 and 2013-01), I no longer think this is correct. My concern is that the compiler may reintroduce memory reads, so the above code could be JITted to something like this:
public event EventHandler MyEvent;

protected void OnMyEvent(EventArgs e)
{
    // JIT removed the local variable and introduced two memory reads instead.
    if (MyEvent != null)
    {
        // A race condition may cause the following line to throw a NullReferenceException.
        MyEvent(this, e);
    }
}
It is legal to remove the local variable and introduce repeated memory reads, because doing so does not change the behavior of the method when it is executed in a single-threaded environment. This is permitted by the ECMA specification (ECMA-335: I.12.6.4). A comprehensible example is also provided in the 2013-01 issue of MSDN Magazine.
Am I missing something here? If not, please advise a workaround.
You only have to add a single line to make the first snippet correct in a multithreaded environment:
public event EventHandler MyEvent;

protected void OnMyEvent(EventArgs e)
{
    EventHandler myEvent = MyEvent;
    // Full fence: neither the compiler/JIT nor the CPU may move reads or writes
    // across this point, so the read above cannot be replaced by later re-reads of MyEvent.
    Thread.MemoryBarrier();
    if (myEvent != null)
    {
        myEvent(this, e);
    }
}
A memory barrier prevents both the compiler and the CPU from reordering reads and writes across it; that is also how volatile reads/writes are implemented. You can read more about memory barriers here.
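An alternative that avoids a full fence is a volatile read of the backing field. A minimal sketch, assuming .NET 4.5+ where System.Threading.Volatile is available; inside the declaring class the field-like event name refers to its backing delegate field, so it can be passed by ref:

using System;
using System.Threading;

public class Publisher
{
    public event EventHandler MyEvent;

    protected void OnMyEvent(EventArgs e)
    {
        // Volatile.Read (acquire semantics) forces a genuine read of the backing
        // field and stops the JIT from replacing the local with later re-reads.
        EventHandler myEvent = Volatile.Read(ref MyEvent);
        if (myEvent != null)
        {
            myEvent(this, e);
        }
    }
}

Compared with Thread.MemoryBarrier(), this constrains only the single read instead of ordering all surrounding reads and writes, which is typically cheaper and states the intent more precisely.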