I have a C# Windows service acting as a server. The service holds some large (>8 GB) data structures in memory and exposes search methods to clients via remoting.
The average search operation executes in <200 ms and the service handles up to 20 requests/sec.
I'm noticing serious performance degradation (>6000 ms) on a regular basis, lasting a few seconds at a time.
My best guess is that the server threads are being stopped by a gen 2 garbage collection from time to time.
I'm considering switching from server GC to workstation GC and wrapping my search method in the following to prevent GC during requests:
// requires: using System.Runtime;
protected static void DoLowLatencyAction(Action action)
{
    // Save the current latency mode so it can be restored afterwards.
    GCLatencyMode oldMode = GCSettings.LatencyMode;
    try
    {
        // Suppress blocking gen 2 collections while the time-sensitive action runs.
        GCSettings.LatencyMode = GCLatencyMode.LowLatency;
        action();
    }
    finally
    {
        // Always restore the original mode, even if the action throws.
        GCSettings.LatencyMode = oldMode;
    }
}
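For reference, the intended call site would look something like this (SearchQuery, SearchResult and Search are stand-ins for my actual remoted types and method):

SearchResult[] FindMatches(SearchQuery query)
{
    SearchResult[] results = null;
    // Run the whole search under LowLatency so a gen 2 collection can't pause it mid-request.
    DoLowLatencyAction(() => results = Search(query));
    return results;
}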
Is this a good idea?
Under what conditions will a GC be performed anyway inside the low-latency block?
Note: I'm running on an x64 server with 8 cores.
Thanks
I've not used GCLatencyMode before, so I cannot comment on whether using it is a good idea or not.
However, are you sure you are using server GC? By default, Windows services use workstation GC.
I've had a similar problem before in a Windows service, and setting server GC mode using:
<configuration>
  <runtime>
    <gcServer enabled="true" />
  </runtime>
</configuration>
in the service's app.config file solved it.
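To confirm which collector the service is actually running with, you can check at runtime (this assumes .NET 4.5 or later, where GCSettings.IsServerGC is available):

using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // True when the server GC is active; false for workstation GC.
        Console.WriteLine("Server GC: " + GCSettings.IsServerGC);
        Console.WriteLine("Latency mode: " + GCSettings.LatencyMode);
    }
}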
Have a read of this post from Tess Ferrandez's blog for more details.