I'm aware that the best practice is to call Dispose on any object that implements IDisposable, especially objects that wrap finite resources like file handles, sockets, GDI handles, etc.
But I'm running into a case where I have an object that has a Font, and I would have to plumb IDisposable through several layers of objects, and review a lot of usages, to make sure I always get the Font disposed. And I'm wondering whether it's worth the complexity.
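To make that plumbing concrete, here's roughly what it ends up looking like (the class names here are made up; this is just a sketch of the standard ownership pattern):

```csharp
using System;
using System.Drawing;

// Hypothetical wrapper: a class that owns a Font and therefore has to
// implement IDisposable itself -- the "plumbing" I'm referring to.
sealed class CaptionStyle : IDisposable
{
    private readonly Font _font;

    public CaptionStyle(string family, float size)
    {
        _font = new Font(family, size);
    }

    public Font Font => _font;

    public void Dispose()
    {
        _font.Dispose();
    }
}

// And now any owner of CaptionStyle has to be IDisposable too, and so on up the chain.
sealed class ReportLayout : IDisposable
{
    private readonly CaptionStyle _caption = new CaptionStyle("Segoe UI", 9f);

    public void Dispose()
    {
        _caption.Dispose();
    }
}
```

Every layer that owns one of these now has to be IDisposable too, and every call site has to be reviewed for a `using` or an explicit Dispose.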
It would be one thing if Font wrapped an HFONT, because GDI resources are system-global. But Font doesn't wrap a GDI handle; it's GDI+, which is a completely separate system, and as far as I understand, is process-local, not system-global like GDI. And unlike Image, Font doesn't ever hold onto filesystem resources (that I know of, anyway).
So my question is: What is the real cost of letting a Font get garbage collected?
I know I would take a small hit for the finalizer, but if the number of "leaked" Fonts is small (say half a dozen), that hit honestly wouldn't be noticeable. Apart from the finalizer, this doesn't seem much different from allocating a mid-sized array and letting the GC clean it up -- it's just memory.
Are there costs I'm not aware of in letting a Font get GCed?
Simple answer: if it's just a few, then no. If it's a lot, then yes. If your application is already stressing the garbage collector, then yes. I would use perfmon to view the number of objects sitting around and the number getting promoted to higher generations, and then decide.
The problem is that garbage collection only runs when there is memory pressure. Unmanaged handles are often a scarcer resource than memory, so you can run out of handles before a collection ever happens, and that shows up as errors.
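For example (contrived, but it illustrates the point):

```csharp
using System.Drawing;

static class FontLeakDemo
{
    // Each Font is a tiny managed object, so allocating thousands of them adds
    // almost no memory pressure and gives the GC little reason to run. The
    // underlying GDI+ font resources pile up until finalizers eventually catch
    // up -- or until font creation starts failing.
    public static void Leaky()
    {
        for (int i = 0; i < 10_000; i++)
        {
            var f = new Font("Arial", 10f); // never disposed
            // ... measure or draw something with f ...
        }
    }

    // The deterministic alternative: scope the resource with 'using'.
    public static void WellBehaved()
    {
        using (var f = new Font("Arial", 10f))
        {
            // ... use f ...
        } // f.Dispose() runs here, releasing the native font resources immediately
    }
}
```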
But for one or two `Font` instances, it won't hurt you overly.
A bigger problem is that some of the objects are shared and shouldn't (or can't) be disposed prematurely...
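In WinForms, for instance, a control's Font is an ambient property that may actually belong to its parent, so the trap looks roughly like this (a sketch, WinForms assumed):

```csharp
using System.Drawing;
using System.Windows.Forms;

// A control's Font is ambient: if you never set it, the getter returns the
// parent's (or the framework's default) Font object, which other controls share.
var form = new Form();
var label = new Label();
form.Controls.Add(label);

Font inherited = label.Font;   // the form's font -- not owned by the label or by us
// inherited.Dispose();        // DON'T: this would break the form and its other children

// Only dispose fonts you created yourself, once nothing references them anymore.
var bold = new Font(inherited, FontStyle.Bold);
label.Font = bold;
// ... later, e.g. when the form closes ...
// bold.Dispose();
```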