BACKGROUND:
In running my app through a profiler, it looks like the hotspots are all involved in allocating a large number of temporary new byte[] arrays.
In one run under CLR Profiler, a few seconds of work (3-5 seconds' worth of CPU time outside the profiler) produced over a gigabyte of garbage, the majority of it byte[] allocations, and triggered over 500 collections.
In some cases it appears that the application is spending upwards of 10% of its CPU time performing collections.
Clearly a rewrite is in order.
So, I am thinking of replacing the new byte[] allocations with a pool class that could reuse the buffer at a later time.
Something like this ...
{
    byte[] temp = Pool.AllocateBuffer(1024);
    ...
}
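For reference, here's a minimal sketch of the kind of pool I have in mind (Pool, AllocateBuffer and deAllocate are just the placeholder names used above; a real version would at least need thread safety and per-size buckets, as sketched here):

using System.Collections.Generic;

// Hypothetical pool: hands out byte[] buffers and takes them back for reuse.
public static class Pool
{
    // One stack of reusable buffers per requested size.
    private static readonly Dictionary<int, Stack<byte[]>> free =
        new Dictionary<int, Stack<byte[]>>();

    public static byte[] AllocateBuffer(int size)
    {
        lock (free)
        {
            Stack<byte[]> stack;
            if (free.TryGetValue(size, out stack) && stack.Count > 0)
            {
                return stack.Pop();    // reuse a previously returned buffer
            }
        }
        return new byte[size];         // nothing pooled for this size: allocate
    }

    public static void deAllocate(byte[] buffer)
    {
        lock (free)
        {
            Stack<byte[]> stack;
            if (!free.TryGetValue(buffer.Length, out stack))
            {
                stack = new Stack<byte[]>();
                free[buffer.Length] = stack;
            }
            stack.Push(buffer);        // make the buffer available for reuse
        }
    }
}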
QUESTION:
How can I force the application to call code in the routine Pool.deAllocate(temp) when temp is no longer needed?
In the above code fragment, temp is a Pool-allocated byte[] buffer, but when it goes out of scope it is simply garbage-collected. That's not a real problem, but the buffer never gets reused by the pool.
I know I could replace the "return 0;" with "Pool.deAllocate(temp); return 0;", but I want the recovery to happen automatically, without an explicit call at every exit point.
Is this even remotely possible?
ANSWER:
You could implement a Buffer class which implements IDisposable and returns the buffer to the pool when it's disposed. You can then give access to the underlying byte array, and so long as everyone plays nicely you can take advantage of reuse.
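A minimal sketch of that idea, building on the hypothetical Pool above (PooledBuffer is an assumed name, not anything in the framework):

using System;

// Wraps a pooled byte[]; disposing it returns the array to the pool.
public sealed class PooledBuffer : IDisposable
{
    private byte[] data;

    public PooledBuffer(int size)
    {
        data = Pool.AllocateBuffer(size);
    }

    // Access to the underlying array. Callers must not hold on to it
    // after the buffer has been disposed, or they'll share it with
    // whoever gets it from the pool next.
    public byte[] Data
    {
        get
        {
            if (data == null)
            {
                throw new ObjectDisposedException("PooledBuffer");
            }
            return data;
        }
    }

    public void Dispose()
    {
        if (data != null)
        {
            Pool.deAllocate(data);   // hand the array back for reuse
            data = null;
        }
    }
}

A using statement then gives you deterministic recovery at the end of the block:

using (PooledBuffer temp = new PooledBuffer(1024))
{
    // ... use temp.Data here ...
}   // Dispose() runs here and the array goes back to the pool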
Be warned, though: this relies on every caller actually disposing the buffer; one that is never disposed simply won't make it back to the pool.
I actually have some code in MiscUtil to do this - see CachingBufferManager, CachedBuffer etc. I can't say I've used it much, mind you... and from what I remember, I made it a bit more complicated than I really needed to...
EDIT: To respond to the comments...
- You can't force the recovery to happen automatically when temp goes out of scope; a using statement is the closest we've got.
- You could add an implicit conversion to byte[] in your buffer class to allow you to call methods which have byte array parameters. Personally I'm not much of a fan of implicit conversions, but it's certainly available as an option.
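For what it's worth, that conversion could look something like this (again just a sketch, added to the hypothetical PooledBuffer class above):

// Inside PooledBuffer: lets a PooledBuffer be passed directly to
// any method that expects a byte[].
public static implicit operator byte[](PooledBuffer buffer)
{
    return buffer.Data;
}

// Usage, given some Stream called stream:
using (PooledBuffer temp = new PooledBuffer(1024))
{
    stream.Read(temp, 0, 1024);   // temp converts to byte[] implicitly
}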