When using the profiler in Visual Studio to track down expensive functions, I've seen on occasion that most of the work ends up being in [clr.dll]. That basically amounts to a black box, and I'm wondering if there's a way to track down why it's spending so much time there.
I assume that clr.dll handles things like JIT compilation, assembly loading, appdomain management, garbage collection, reflection, and so on. But that makes it really difficult to tell which of my code is causing it to spend so much time there.
Obviously it's some code other than the runtime itself that is causing all that time to be spent in clr.dll, so how do you track down which code is at fault?
You need to know which part of your code (the code you can edit and compile, which is the only code you can fix) is responsible for a substantial percentage of the time being used.
It does no good to know that clr.dll is using a lot of time unless you can tell which part of your code is responsible for it.
That information is in the call stack.
If you have a method, or even a single line of code, that is on the stack for some percentage of time, such as 20%, then it is responsible for roughly that percentage of the time. If you could somehow eliminate that line of code (or make it take far less time), that 20% of the total time would become zero, or nearly so, giving you a speedup factor of 1.0/0.8 = 1.25, i.e. 25% faster.
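As a quick sanity check on that arithmetic (just an illustration of the Amdahl-style calculation, not profiler code), here is the speedup you get from eliminating work that occupies a given fraction of total time:

```python
def speedup(f: float) -> float:
    """Overall speedup from removing work that takes fraction f of total time.

    If a line is on the stack for fraction f of samples, removing it
    leaves fraction (1 - f) of the original run time.
    """
    return 1.0 / (1.0 - f)

print(speedup(0.20))  # 1.25 -> the program runs 25% faster
print(speedup(0.50))  # 2.0  -> twice as fast
```

Note how the payoff grows nonlinearly: removing a 50% cost doubles the speed, which is why repeated rounds of this process compound so well.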
So how do you find such lines? This is the method I use. Nobody claims it is pretty, until they see the overall results. Applied repeatedly, it can produce large speedup factors.
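Assuming the method in question is manual stack sampling (pausing the program at random times and reading the call stack), surprisingly few pauses are needed. The simulation below (my own illustration, not part of any tool) estimates how often a line that costs 20% of the time shows up in at least 2 out of 10 random pauses:

```python
import random

def prob_seen_twice(f: float = 0.20, samples: int = 10,
                    trials: int = 100_000, seed: int = 1) -> float:
    """Estimate P(a line on the stack fraction f of the time appears
    in >= 2 of `samples` random stack samples) by simulation."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        # Each pause independently catches the line with probability f.
        seen = sum(random.random() < f for _ in range(samples))
        if seen >= 2:
            hits += 1
    return hits / trials

print(prob_seen_twice())  # about 0.62
```

The exact binomial answer is 1 - 0.8^10 - 10(0.2)(0.8^9) ≈ 0.62, so ten pauses will usually expose a 20% line; if it appears on two or more stacks, it is worth investigating.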
Based on my experience, it's probably the GC. If you use LINQ, it's almost certainly the GC, since LINQ queries tend to allocate heavily. I recommend CLRProfiler to track down excessive Gen 0 allocations.