We're experimenting with appending timestamps to some URLs so that clients can cache the files but refresh them when they actually change. We have code that boils down to this:
DateTime ts = File.GetLastWriteTime(absPath);
where absPath is the physical path the URL maps to (via Server.MapPath). So the web server will be checking this file's last write time every time we serve up a link to the file. Kinda gives me the willies - should it?
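For concreteness, here is a minimal sketch of the pattern being described. The helper name `VersionedUrl` is made up, and the physical path is passed in directly so the sketch doesn't depend on a live ASP.NET request (in the real code it would come from Server.MapPath):

```csharp
using System;
using System.IO;

// Hypothetical helper illustrating the question's pattern: append the file's
// last-write time as a cache-busting query-string parameter.
public static class UrlVersioner
{
    public static string VersionedUrl(string virtualPath, string absPath)
    {
        // This hits the file system on every call -- the cost being asked about.
        DateTime ts = File.GetLastWriteTimeUtc(absPath);
        return virtualPath + "?v=" + ts.Ticks;
    }
}
```

A client can then cache `/js/app.js?v=633793055284430000` indefinitely; when the file is rewritten, the ticks value changes and the browser sees a brand-new URL to fetch.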
You should performance-test it, but off-hand I doubt it's any more expensive than checking a file's other attributes (e.g. whether it exists, or whether it's read-only), and it's certainly less expensive than actually opening the file.
If (after testing) you decide that it's a problem, you could also cache your calls to GetLastWriteTime
(e.g. don't call it more than once every 5 seconds for any given file).
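That caching suggestion might look something like the following sketch (the class name and 5-second interval are illustrative, not from the original code):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Hypothetical cache: re-reads a file's last-write time at most once per
// TTL interval, returning the previously read value in between.
public static class WriteTimeCache
{
    private static readonly TimeSpan Ttl = TimeSpan.FromSeconds(5);

    // Maps absolute path -> (when we last checked, last write time we saw).
    private static readonly ConcurrentDictionary<string, Tuple<DateTime, DateTime>> cache =
        new ConcurrentDictionary<string, Tuple<DateTime, DateTime>>();

    public static DateTime GetLastWriteTime(string absPath)
    {
        DateTime now = DateTime.UtcNow;
        Tuple<DateTime, DateTime> entry;
        if (cache.TryGetValue(absPath, out entry) && now - entry.Item1 < Ttl)
            return entry.Item2; // still fresh; skip the file-system call

        DateTime ts = File.GetLastWriteTimeUtc(absPath);
        cache[absPath] = Tuple.Create(now, ts);
        return ts;
    }
}
```

The trade-off is that a changed file can serve a stale timestamp for up to the TTL, which is usually acceptable for cache-busting URLs.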
Also (I've never used it myself), if caching is a concern, I hope you've considered delegating it to a dedicated caching proxy like Squid instead of rolling your own.
I have not tried this, but your question is relevant to a situation that I have been thinking about.
You did not indicate what data is changing (a database, XML files, etc.).
ASP.NET caching does support updating the cache based on a variety of dependencies.
Check out the sections on File-based Dependency, Time-based Dependency, and Key-based Dependency in the article linked below.
"Dependencies allow us to invalidate a particular item within the Cache based on changes to files, changes to other Cache keys, or at a fixed point in time. Let's look at each of these dependencies."
Here is the article:
http://msdn.microsoft.com/en-us/library/ms972379.aspx
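As a sketch of the file-based dependency the article describes: an item inserted into the ASP.NET cache with a CacheDependency is evicted automatically when the watched file changes. (Using the file path as the cache key here is just one choice for the example.)

```csharp
using System.IO;
using System.Web;
using System.Web.Caching;

// Sketch of a file-based cache dependency. ASP.NET watches the file and
// evicts the cached entry the next time the file is modified.
public static class DependencyExample
{
    public static string GetContent(string absPath)
    {
        string cached = (string)HttpRuntime.Cache.Get(absPath);
        if (cached == null)
        {
            cached = File.ReadAllText(absPath);
            // The entry disappears automatically when absPath changes on disk.
            HttpRuntime.Cache.Insert(absPath, cached, new CacheDependency(absPath));
        }
        return cached;
    }
}
```

This inverts the original approach: instead of polling GetLastWriteTime on every request, the cache invalidates itself when the file changes.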
Thanks
Joe