I know that the general guideline for implementing the dispose pattern warns against implementing a virtual Dispose(). However, most often we're only working with managed resources inside a class, so the full dispose pattern seems like overkill - i.e. we don't need a finalizer. In such cases, is it OK to have a virtual Dispose() in the base class?
Consider this simple example:
abstract class Base : IDisposable
{
    private bool disposed = false;

    public SecureString Password { get; set; } // SecureString implements IDisposable.

    public virtual void Dispose()
    {
        if (disposed)
            return;

        if (Password != null)
            Password.Dispose();

        disposed = true;
    }
}
class Sub : Base
{
    private bool disposed = false;

    public NetworkStream NStream { get; set; } // NetworkStream implements IDisposable.

    public override void Dispose()
    {
        if (disposed)
            return;

        if (NStream != null)
            NStream.Dispose();

        disposed = true;

        base.Dispose();
    }
}
I find this more readable than a complete dispose pattern. I understand that this example would become problematic if we introduced an unmanaged resource into the Base class. But suppose that this won't happen. Is the above example then valid/safe/robust, or does it pose any problem? Should I stick to the standard full-blown dispose pattern, even if no unmanaged resources are used? Or is the above approach indeed perfectly OK in this situation - which in my experience is far more common than dealing with unmanaged resources?
I have given up on the full-blown IDisposable pattern in cases where I am not dealing with unmanaged resources.
I don't see any reason why you shouldn't be able to forgo the pattern and make Dispose virtual if you can ensure that derived classes won't introduce unmanaged resources!
(And here lies a problem, because you cannot enforce this. You have no control over what fields a derived class will add, so there is no absolute safety with a virtual Dispose. But then, even if you used the full-blown pattern, a derived class could get things wrong, so there is always some trust involved that subtypes adhere to certain rules.)
First, in cases where we only deal with managed objects, having a finalizer would be pointless: if Dispose is called from the finalizer (Dispose(disposing: false) according to the full-blown pattern), we may not safely access/dereference other reference-typed managed objects, because their own finalizers may already have run. Only value-typed objects reachable from the object being finalized may be safely accessed.
If the finalizer is pointless, it is also pointless to have the disposing flag, which is used to distinguish whether Dispose was called explicitly or by the finalizer.
It is also unnecessary to call GC.SuppressFinalize if there is no finalizer.
I am not even sure whether it is then still imperative that Dispose not throw any exceptions in a managed-only scenario. (See @usr's comment below.) AFAIK this rule exists mostly to protect the finalizer thread.
As you can see, once the finalizer goes, much of the classic Disposable pattern is no longer necessary.
I understand that this example would become problematic if we introduced an unmanaged resource into the Base class. But suppose that this won't happen. Is the above example then valid/safe/robust, or does it pose any problem? Should I stick to the standard full-blown dispose pattern, even if no unmanaged resources are used?
Though your base class uses exclusively managed resources (and this will not change in the future), you cannot guarantee that a derived class will not use unmanaged resources. So consider implementing the Basic Dispose Pattern in your base class (a class with a virtual Dispose(bool) but without a finalizer), even if you always call the Dispose(bool) method with true.
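To make the suggestion concrete, here is a minimal sketch of such a Basic Dispose Pattern applied to the Base class from the question. There is no finalizer, the public Dispose() is non-virtual, and all cleanup sits behind the virtual Dispose(bool):
using System;
using System.Security;

abstract class Base : IDisposable
{
    private bool disposed = false;

    public SecureString Password { get; set; } // SecureString implements IDisposable.

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this); // harmless without a finalizer, needed once a derived class adds one
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed)
            return;

        if (disposing)
        {
            // Managed members are only touched on the explicit Dispose path,
            // never from a (future) finalizer.
            if (Password != null)
                Password.Dispose();
        }

        disposed = true;
    }
}
A fully managed subclass then simply overrides Dispose(bool), disposes its own members when disposing is true, and calls base.Dispose(disposing).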
When, some day, a new subclass that uses unmanaged resources is derived from your Base, its finalizer might want to call Dispose(bool) with false. Of course, instead of preparing the base class like that, you could also introduce a new Dispose(bool) in the derived class only:
public class SubUnmanaged : Base
{
    IntPtr someNativeHandle;
    IDisposable someManagedResource;

    public override sealed void Dispose()
    {
        base.Dispose();
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    ~SubUnmanaged()
    {
        base.Dispose();
        Dispose(false);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (someNativeHandle != IntPtr.Zero)
            Free(someNativeHandle); // Free stands in for whatever native call releases the handle.

        if (disposing)
            someManagedResource.Dispose();
    }
}
But I think this is a little bit nasty. By sealing the original virtual Dispose() method, you prevent further inheritors from being confused by multiple virtual Dispose methods, and you can call the common base.Dispose() both from the finalizer and from the sealed Dispose(). Still, you must repeat the same workaround in every subclass that uses unmanaged resources and is derived from Base or from a fully managed subclass. And this already deviates from the standard pattern, which always makes things more difficult.
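For contrast, if the base class already implements the Basic Dispose Pattern sketched above (a virtual Dispose(bool) and no finalizer), a later subclass with unmanaged resources only needs to override Dispose(bool) and add a finalizer; no second dispose chain and no sealing is required. A sketch under that assumption, with Free again standing in for the actual native release call:
using System;

public class SubUnmanaged : Base
{
    private IntPtr someNativeHandle;
    private IDisposable someManagedResource;

    ~SubUnmanaged()
    {
        Dispose(false); // finalizer path: release only the unmanaged handle
    }

    protected override void Dispose(bool disposing)
    {
        if (someNativeHandle != IntPtr.Zero)
        {
            Free(someNativeHandle); // hypothetical native release call
            someNativeHandle = IntPtr.Zero;
        }

        if (disposing)
        {
            if (someManagedResource != null)
                someManagedResource.Dispose();
        }

        base.Dispose(disposing); // the base disposes Password only when disposing is true
    }
}
The public, non-virtual Dispose() inherited from Base calls Dispose(true) and GC.SuppressFinalize(this), so the finalizer only ever runs when the object was not disposed explicitly.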