Disabling ASP.NET HttpHandler response caching

Background

I'm in the midst of comparing the performance of NancyFx and ServiceStack.NET running under IIS 7 (testing on a Windows 7 host). Both are insanely fast - testing locally, each framework processes 10,000+ req/sec, with ServiceStack being about 20% faster.

The problem I'm running into is that ASP.NET appears to be caching the responses for each unique URI request from the HttpHandler, quickly leading to massive memory pressure (3+ GB) and overworking the garbage collector (~25% of time consumed by the GC). So far I've been unable to disable the caching and buildup of objects, and am looking for suggestions on how to disable this behavior.

Details

The request loop is basically as follows:

for i = 1..100000:
    string uri = http://localhost/users/{i}
    Http.Get(uri)

The response is a simple JSON object, formatted as { UserID: n }.

I've cracked open WinDBG, and for each request there are:

  • One System.Web.FileChangeEventHandler
  • Two System.Web.Configuration.MapPathCacheInfos
  • Two System.Web.CachedPathDatas
  • Three System.Web.Caching.CacheDependencys
  • Five System.Web.Caching.CacheEntrys

Obviously, these cache items are what is leading me to believe it's a cache bloat issue (I'd love to get rid of 150,000 unusable objects!).

What I've tried so far

  • In IIS 'HTTP Response Headers', set 'Expire Web content' to 'immediately'.
  • In the web.config:

    <system.web>
      <caching>
        <outputCache enableOutputCache="false" enableFragmentCache="false"/>
      </caching>
    </system.web>
    
  • Also in the web.config (with many variations on the policies, including none):

    <caching enabled="false" enableKernelCache="false">
      <profiles>
        <add policy="DontCache" kernelCachePolicy="DontCache" extension="*"/>
      </profiles>
    </caching>
    
  • Looked through the source code of the frameworks to see if there might be any "features" built in that would use ASP.NET caching. While there are caching helpers, they are private to the framework itself and do not appear to leverage ASP.NET caching.

Update #1

Digging through reflector I've found that setting the value for UrlMetadataSlidingExpiration to zero eliminates a large portion of the excessive memory usage, at the expense of cutting throughput by 50% (the FileAuthorizationModule class caches the FileSecurityDescriptors, which must be somewhat expensive to generate, when UrlMetadataSlidingExpiration is non-zero).

This is done by updating the web.config and placing the following in <system.web>:

<hostingEnvironment urlMetadataSlidingExpiration="00:00:00"/>

I'm going to try to fully disable the FileAuthorizationModule from running, if possible, to see if that helps. However, ASP.NET is still generating 2*N MapPathCacheInfo and CacheEntry objects, so memory is still getting consumed, just at much slower rate.
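A sketch of what disabling that module might look like in the IIS 7 integrated pipeline (the module name "FileAuthorization" is the standard IIS 7 managed-module name, but verify against your applicationHost.config before relying on it):

```xml
<system.webServer>
  <modules>
    <!-- Remove the managed FileAuthorizationModule so its
         FileSecurityDescriptor caching never runs. -->
    <remove name="FileAuthorization" />
  </modules>
</system.webServer>
```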

Update #2

The other half of the problem is the same issue as described here: Prevent many different MVC URLs from filling ASP.NET Cache. Setting

<cache percentagePhysicalMemoryUsedLimit="1" privateBytesPollTime="00:00:01"/>

helps, but even with these very aggressive settings memory usage quickly rises to 2.5GB (compared to 4GB). Ideally these objects would never be created in the first place. Failing that, I may resort to a hacky solution of using reflection to clear out the Caches (all these entries are "private" and are not enumerated when iterating over the public Cache).
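For reference, a sketch of where that element sits in web.config (same values as above):

```xml
<system.web>
  <caching>
    <!-- Poll private bytes every second and keep the cache's
         physical-memory budget as small as the runtime allows. -->
    <cache percentagePhysicalMemoryUsedLimit="1"
           privateBytesPollTime="00:00:01" />
  </caching>
</system.web>
```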

asked Jul 28 '11 by Kevin Pullin

3 Answers

A late response for others who suffer the same problem:

This is a known issue: KB 2504047

This issue occurs because the unique requests that try to access the same resources are cached as MapPathCacheInfo objects for 10 minutes.

While the objects are cached for 10 minutes, the memory consumption of the W3wp.exe process increases significantly.

You can download the hotfix from that KB article.

answered Oct 22 '22 by lstern


I think this is less of an issue of caching, and more of an issue of "high memory utilization".

Two things:

First, use an IDisposable-friendly object (one that works with the "using" keyword). This allows you to dispose of the object sooner rather than later, creating less pressure on the garbage collector in the long run.

for (int i = 0; i < 10000; i++) {
    using (System.Net.WebClient c = new System.Net.WebClient()) {
        // Dispose the response stream as well, not just the client.
        using (System.IO.Stream stream = c.OpenRead(String.Format("http://url.com/{0}", i))) {
            // consume the stream here
        }
    }
}

From your pseudo-code, I can only assume that you're using System.Net.HttpWebRequest, which isn't disposable and will probably hang around longer than it should if you are making successive calls.

Secondly, if you are making successive calls to an external server, I'd put a delay between each call. This gives some breathing room, since your processor will run through the for-loop much faster than the network can respond (otherwise you just keep queuing up requests and slowing down the ones actually being processed).

System.Threading.Thread.Sleep(250);

Obviously, the best solution would be to make a single call with a list of the users you want to retrieve and deal with just one web request/response.
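As a sketch of that batching idea (the endpoint and query format are hypothetical - the service would need to expose such a route):

```csharp
// Requires: using System.Linq; for Enumerable.Range.
// Hypothetical batch endpoint: one round trip instead of N.
using (var client = new System.Net.WebClient())
{
    string ids = string.Join(",", Enumerable.Range(1, 100));
    // e.g. GET http://localhost/users?ids=1,2,...,100 returning a JSON array
    string json = client.DownloadString("http://localhost/users?ids=" + ids);
    // Deserialize the array once, instead of handling 100 separate responses.
}
```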

answered Oct 22 '22 by sonjz


Ensure the IsReusable property returns false so ASP.NET doesn't reuse the same handler instance to process subsequent requests.
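A minimal handler sketch with IsReusable returning false (the class name is illustrative; the JSON matches the response shape from the question):

```csharp
public class UserHandler : System.Web.IHttpHandler
{
    // Returning false tells ASP.NET not to pool/reuse this instance.
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(System.Web.HttpContext context)
    {
        context.Response.ContentType = "application/json";
        context.Response.Write("{ \"UserID\": 1 }");
    }
}
```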

http://support.microsoft.com/kb/308001

answered Oct 22 '22 by Frazell Thomas