It is generally accepted (I believe!) that a lock will force any values from fields to be reloaded (essentially acting as a memory barrier or fence - my terminology in this area gets a bit loose, I'm afraid), with the consequence that fields which are only ever accessed inside a lock do not themselves need to be volatile. (If I'm wrong already, just say!)
A good comment was raised here, questioning whether the same is true if code does a Wait() - i.e. once it has been Pulse()d, will it reload fields from memory, or could they still be sitting in a register (etc.)? Or more simply: does the field need to be volatile to ensure that the current value is obtained when resuming after a Wait()?
Looking at Reflector, Wait calls down into ObjWait, which is a managed internalcall (the same as Enter).
The scenario in question was:
bool closing;
public bool TryDequeue(out T value) {
    lock (queue) { // arbitrary lock-object (a private readonly ref-type)
        while (queue.Count == 0) {
            if (closing) { // <==== (2) access field here
                value = default(T);
                return false;
            }
            Monitor.Wait(queue); // <==== (1) waits here
        }
        // ... do something with the head of the queue
    }
}
Obviously I could just make it volatile, or I could move this out so that I exit and re-enter the Monitor every time it gets pulsed, but I'm intrigued to know whether either is actually necessary.
Since the Wait() method is releasing and reacquiring the Monitor lock, if lock performs the memory fence semantics, then Monitor.Wait() will as well.
To hopefully address your comment: the locking behavior of Monitor.Wait() is in the docs (http://msdn.microsoft.com/en-us/library/aa332339.aspx), emphasis added:
When a thread calls Wait, it releases the lock on the object and enters the object's waiting queue. The next thread in the object's ready queue (if there is one) acquires the lock and has exclusive use of the object. All threads that call Wait remain in the waiting queue until they receive a signal from Pulse or PulseAll, sent by the owner of the lock. If Pulse is sent, only the thread at the head of the waiting queue is affected. If PulseAll is sent, all threads that are waiting for the object are affected. When the signal is received, one or more threads leave the waiting queue and enter the ready queue. A thread in the ready queue is permitted to reacquire the lock. This method returns when the calling thread reacquires the lock on the object.
If you're asking about a reference for whether a lock/acquired Monitor implies a memory barrier, the ECMA CLI spec says the following:
12.6.5 Locks and Threads:
Acquiring a lock (System.Threading.Monitor.Enter or entering a synchronized method) shall implicitly perform a volatile read operation, and releasing a lock (System.Threading.Monitor.Exit or leaving a synchronized method) shall implicitly perform a volatile write operation. See §12.6.7.
12.6.7 Volatile Reads and Writes:
A volatile read has "acquire semantics" meaning that the read is guaranteed to occur prior to any references to memory that occur after the read instruction in the CIL instruction sequence. A volatile write has "release semantics" meaning that the write is guaranteed to happen after any memory references prior to the write instruction in the CIL instruction sequence.
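To tie the spec wording back to the question's scenario, here is a minimal sketch of what the producing/closing side might look like (my own illustration; Enqueue and Close are assumed names, the queue and closing fields come from the question's snippet). The write to closing happens under the same lock that TryDequeue reads it under, so the release fence on exit publishes the write, and the acquire fence performed when the waiting thread reacquires the lock inside Wait() makes it visible there without volatile.

public void Enqueue(T value) {
    lock (queue) {               // Monitor.Enter: implicit volatile read (acquire)
        queue.Enqueue(value);
        Monitor.Pulse(queue);    // wake one waiter; the lock is still held here
    }                            // Monitor.Exit: implicit volatile write (release)
}

public void Close() {
    lock (queue) {
        closing = true;          // written under the same lock TryDequeue reads it under
        Monitor.PulseAll(queue); // wake all waiters so each one re-checks closing
    }                            // the release fence on exit publishes the write
}

Because both the write in Close() and the read in TryDequeue() happen while the lock on queue is held, these acquire/release pairs provide all the ordering needed, and closing does not have to be volatile.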
Further to Michael Burr's answer, not only does Wait release and re-acquire the lock, it does this so that another thread can take out the lock in order to examine the shared state and call Pulse. If the second thread doesn't take out the lock, Pulse will throw. If it doesn't Pulse, the first thread's Wait won't return. Hence any other thread's access to the shared state must happen within a properly memory-barriered scenario.
So assuming the Monitor methods are being used according to the locally-checkable rules, all memory accesses happen inside a lock, and hence only the automatic memory barrier support of lock is relevant/necessary.
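To make the "Pulse will throw" point concrete, a small sketch of my own (not from the answer above): Monitor.Pulse requires the calling thread to own the lock, otherwise it throws SynchronizationLockException.

// Monitor.Pulse(queue);     // without the lock this throws SynchronizationLockException

lock (queue) {
    Monitor.Pulse(queue);    // correct: the pulsing thread owns the lock, and the
}                            // waiter only resumes after this thread releases it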
Maybe I can help you this time... instead of using a volatile you can use Interlocked.Exchange with an integer.
// closing is declared as an int rather than a bool, e.g.: int closing;
if (closing == 1) { // <==== (2) access field here
    value = default(T);
    return false;
}

// somewhere else in your code:
Interlocked.Exchange(ref closing, 1);
Interlocked.Exchange is a synchronization mechanism, volatile isn't... I hope that's worth something (but you probably already thought about this).
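For completeness, a sketch of how that might fit together (my own illustration; Close is an assumed name, and I've used Interlocked.CompareExchange as an explicitly fenced read where the answer just reads the field directly). The field becomes an int because Interlocked.Exchange has no bool overload.

int closing;   // 0 = open, 1 = closing

public void Close() {
    Interlocked.Exchange(ref closing, 1);   // synchronized write (full fence)
    lock (queue) {
        Monitor.PulseAll(queue);            // wake waiters so they re-check the flag
    }
}

// inside the TryDequeue loop, a fenced read of the flag:
if (Interlocked.CompareExchange(ref closing, 0, 0) == 1) {
    value = default(T);
    return false;
}

That said, as the earlier answers point out, if closing is only ever touched while holding the lock on queue, the lock's own acquire/release fences already make the plain bool read safe.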