 

How can I force a hard refresh (ctrl+F5)?

We are actively developing a website using .NET and MVC, and our testers are having fits trying to get the latest stuff to test. Every time we modify the style sheet or external JavaScript files, testers need to do a hard refresh (Ctrl+F5 in IE) in order to see the latest changes.

Is it possible for me to force their browsers to get the latest version of these files instead of them relying on their cached versions? We're not doing any kind of special caching from IIS or anything.

Once this goes into production, it will be hard to tell clients that they need to hard refresh in order to see the latest changes.

Thanks!

Asked by Chris Conway on Jun 01 '09.

People also ask

How do I force a Windows refresh?

Hold the Ctrl (Control) key, press the F5 key. Or hold the Ctrl (Control) key, click the Refresh button.

Is F5 hard refresh?

Chrome also offers the reload shortcut combinations of “Ctrl + F5” and “Ctrl + Shift + R” to reload the currently open page and override the locally cached version. F5 refreshes the page you are currently on.


7 Answers

I came up against this too and found what I consider to be a very satisfying solution.

Note that using query parameters (.../foo.js?v=1) reportedly means the file will not be cached by some proxy servers. It's better to modify the path directly.

We need the browser to fetch a fresh copy whenever the content changes. So, in the code I wrote, the path includes an MD5 hash of the file being referenced. If the file is republished to the web server but has the same content, then its URL is identical. What's more, it's safe to use an infinite expiry for caching too, as the content of that URL will never change.

This hash is calculated at runtime (and cached in memory for performance), so there's no need to modify your build process. In fact, since adding this code to my site, I haven't had to give it much thought.
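The core idea can be sketched in a few lines. This is an illustrative Python sketch, not the actual C# implementation (which follows below); the function name and the /c/ prefix are assumptions that mirror the route used later:

```python
import hashlib

def content_hash_url(content: bytes, prefix: str = "/c/") -> str:
    """Map file content to a URL path containing its MD5 hex digest."""
    return prefix + hashlib.md5(content).hexdigest()

# The URL changes if and only if the content changes, so an infinite
# cache expiry on that URL is safe.
old_css = b"body { color: red; }"
new_css = b"body { color: blue; }"
url_old = content_hash_url(old_css)   # stable across republishes
url_new = content_hash_url(new_css)   # differs, forcing a fresh fetch
```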

You can see it in action at this site: Dive Seven - Online Dive Logging for Scuba Divers

In CSHTML/ASPX files

<head>
  @Html.CssImportContent("~/Content/Styles/site.css")
  @Html.ScriptImportContent("~/Content/Styles/site.js")
</head>
<img src="@Url.ImageContent("~/Content/Images/site.png")" />

This generates markup resembling:

<head>
  <link rel="stylesheet" type="text/css"
        href="/c/e2b2c827e84b676fa90a8ae88702aa5c" />
  <script src="/c/240858026520292265e0834e5484b703"></script>
</head>
<img src="/c/4342b8790623f4bfeece676b8fe867a9" />

In Global.asax.cs

We need to create a route to serve the content at this path:

routes.MapRoute(
    "ContentHash",
    "c/{hash}",
    new { controller = "Content", action = "Get" },
    new { hash = @"^[0-9a-zA-Z]+$" } // constraint
);

ContentController

This class is quite long. The crux of it is simple, but it turns out that you need to watch for changes to the file system in order to force recalculation of cached file hashes. I publish my site via FTP and, for example, the bin folder is replaced before the Content folder. Anyone (human or spider) that requests the site during that period will cause the old hash to be updated.

The code looks much more complex than it is due to read/write locking.

public sealed class ContentController : Controller
{
    #region Hash calculation, caching and invalidation on file change

    private static readonly Dictionary<string, string> _hashByContentUrl = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    private static readonly Dictionary<string, ContentData> _dataByHash = new Dictionary<string, ContentData>(StringComparer.Ordinal);
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
    private static readonly object _watcherLock = new object();
    private static FileSystemWatcher _watcher;

    internal static string ContentHashUrl(string contentUrl, string contentType, HttpContextBase httpContext, UrlHelper urlHelper)
    {
        EnsureWatching(httpContext);

        _lock.EnterUpgradeableReadLock();
        try
        {
            string hash;
            if (!_hashByContentUrl.TryGetValue(contentUrl, out hash))
            {
                var contentPath = httpContext.Server.MapPath(contentUrl);

                // Calculate and combine the hash of both file content and path
                byte[] contentHash;
                byte[] urlHash;
                using (var hashAlgorithm = MD5.Create())
                {
                    using (var fileStream = System.IO.File.Open(contentPath, FileMode.Open, FileAccess.Read, FileShare.Read))
                        contentHash = hashAlgorithm.ComputeHash(fileStream);
                    urlHash = hashAlgorithm.ComputeHash(Encoding.ASCII.GetBytes(contentPath));
                }
                var sb = new StringBuilder(32);
                for (var i = 0; i < contentHash.Length; i++)
                    sb.Append((contentHash[i] ^ urlHash[i]).ToString("x2"));
                hash = sb.ToString();

                _lock.EnterWriteLock();
                try
                {
                    _hashByContentUrl[contentUrl] = hash;
                    _dataByHash[hash] = new ContentData { ContentUrl = contentUrl, ContentType = contentType };
                }
                finally
                {
                    _lock.ExitWriteLock();
                }
            }

            return urlHelper.Action("Get", "Content", new { hash });
        }
        finally
        {
            _lock.ExitUpgradeableReadLock();
        }
    }

    private static void EnsureWatching(HttpContextBase httpContext)
    {
        if (_watcher != null)
            return;

        lock (_watcherLock)
        {
            if (_watcher != null)
                return;

            var contentRoot = httpContext.Server.MapPath("/");
            _watcher = new FileSystemWatcher(contentRoot) { IncludeSubdirectories = true, EnableRaisingEvents = true };
            var handler = (FileSystemEventHandler)delegate(object sender, FileSystemEventArgs e)
            {
                // TODO would be nice to have an inverse function to MapPath.  does it exist?
                var changedContentUrl = "~" + e.FullPath.Substring(contentRoot.Length - 1).Replace("\\", "/");
                _lock.EnterWriteLock();
                try
                {
                    // if there is a stored hash for the file that changed, remove it
                    string oldHash;
                    if (_hashByContentUrl.TryGetValue(changedContentUrl, out oldHash))
                    {
                        _dataByHash.Remove(oldHash);
                        _hashByContentUrl.Remove(changedContentUrl);
                    }
                }
                finally
                {
                    _lock.ExitWriteLock();
                }
            };
            _watcher.Changed += handler;
            _watcher.Deleted += handler;
        }
    }

    private sealed class ContentData
    {
        public string ContentUrl { get; set; }
        public string ContentType { get; set; }
    }

    #endregion

    public ActionResult Get(string hash)
    {
        _lock.EnterReadLock();
        try
        {
            // set a very long expiry time
            Response.Cache.SetExpires(DateTime.Now.AddYears(1));
            Response.Cache.SetCacheability(HttpCacheability.Public);

            // look up the resource that this hash applies to and serve it
            ContentData data;
            if (_dataByHash.TryGetValue(hash, out data))
                return new FilePathResult(data.ContentUrl, data.ContentType);

            // TODO replace this with however you handle 404 errors on your site
            throw new Exception("Resource not found.");
        }
        finally
        {
            _lock.ExitReadLock();
        }
    }
}

Helper Methods

You can remove the attributes if you don't use ReSharper.

public static class ContentHelpers
{
    [Pure]
    public static MvcHtmlString ScriptImportContent(this HtmlHelper htmlHelper, [NotNull, PathReference] string contentPath, [CanBeNull, PathReference] string minimisedContentPath = null)
    {
        if (contentPath == null)
            throw new ArgumentNullException("contentPath");
#if DEBUG
        var path = contentPath;
#else
        var path = minimisedContentPath ?? contentPath;
#endif

        // note: pass 'path' (not 'contentPath') so the minified file is used in release builds
        var url = ContentController.ContentHashUrl(path, "text/javascript", htmlHelper.ViewContext.HttpContext, new UrlHelper(htmlHelper.ViewContext.RequestContext));
        return new MvcHtmlString(string.Format(@"<script src=""{0}""></script>", url));
    }

    [Pure]
    public static MvcHtmlString CssImportContent(this HtmlHelper htmlHelper, [NotNull, PathReference] string contentPath)
    {
        // TODO optional 'media' param? as enum?
        if (contentPath == null)
            throw new ArgumentNullException("contentPath");

        var url = ContentController.ContentHashUrl(contentPath, "text/css", htmlHelper.ViewContext.HttpContext, new UrlHelper(htmlHelper.ViewContext.RequestContext));
        return new MvcHtmlString(String.Format(@"<link rel=""stylesheet"" type=""text/css"" href=""{0}"" />", url));
    }

    [Pure]
    public static string ImageContent(this UrlHelper urlHelper, [NotNull, PathReference] string contentPath)
    {
        if (contentPath == null)
            throw new ArgumentNullException("contentPath");
        string mime;
        if (contentPath.EndsWith(".png", StringComparison.OrdinalIgnoreCase))
            mime = "image/png";
        else if (contentPath.EndsWith(".jpg", StringComparison.OrdinalIgnoreCase) || contentPath.EndsWith(".jpeg", StringComparison.OrdinalIgnoreCase))
            mime = "image/jpeg";
        else if (contentPath.EndsWith(".gif", StringComparison.OrdinalIgnoreCase))
            mime = "image/gif";
        else
            throw new NotSupportedException("Unexpected image extension.  Please add code to support it: " + contentPath);
        return ContentController.ContentHashUrl(contentPath, mime, urlHelper.RequestContext.HttpContext, urlHelper);
    }
}

Feedback appreciated!

Answered by Drew Noakes on Sep 22 '22.


You need to modify the names of the external files you refer to. For example, add the build number at the end of each file name, like style-1423.css, and make the numbering part of your build automation so that the files and the references are deployed under a unique name each time.
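That renaming step can be sketched roughly as follows (Python, purely illustrative; the helper name is hypothetical):

```python
import os

def versioned_name(filename: str, build: int) -> str:
    """Embed the build number in the file name: style.css -> style-1423.css."""
    stem, ext = os.path.splitext(filename)
    return "%s-%d%s" % (stem, build, ext)

# at deploy time, rename every asset and update the references to match
assets = ["style.css", "site.js"]
deployed = [versioned_name(name, 1423) for name in assets]
```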

Answered by Serhat Ozgel on Sep 20 '22.


Rather than a build number or random number, programmatically append the file's last-modified date to the URL as a querystring. This prevents accidents where you forget to update the querystring manually, and still lets the browser cache the file when it has not changed.

Example output could look like this:

<script src="../../Scripts/site.js?v=20090503114351" type="text/javascript"></script>
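Generating such a URL can be sketched like this (Python, purely illustrative; the stamped_url helper is hypothetical, and a throwaway temp file stands in for Scripts/site.js):

```python
import os
import tempfile
import time

def stamped_url(url_path: str, file_path: str) -> str:
    """Append the file's last-modified time as a ?v= querystring."""
    mtime = os.path.getmtime(file_path)
    stamp = time.strftime("%Y%m%d%H%M%S", time.localtime(mtime))
    return "%s?v=%s" % (url_path, stamp)

# demo with a throwaway file standing in for Scripts/site.js
with tempfile.NamedTemporaryFile(suffix=".js", delete=False) as f:
    f.write(b"// site scripts")
demo = stamped_url("/Scripts/site.js", f.name)
```

The URL only changes when the file's timestamp does, so unchanged files stay cached.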
Answered by D'Arcy Rittich on Sep 22 '22.


Since you mention only your testers complaining, have you considered having them turn off their local browser cache, so that it checks for new content every time? It will slow their browsers a touch... but unless you are doing usability testing every time, this is probably a whole lot easier than suffixing the filename, adding a querystring param, or modifying the headers.

This works in 90% of the cases in our test environments.

Answered by Chad Ruppert on Sep 22 '22.


What you might do is call your JS file with a random string each time the page refreshes. This way you are sure it's always fresh.

You just need to call it this way "/path/to/your/file.js?<random-number>"

Example: jquery-min-1.2.6.js?234266
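A sketch of generating such a URL (Python, purely illustrative). Note the trade-off: because the random value changes on every page load, the browser never reuses its cached copy and the file is re-downloaded each time:

```python
import random

def cache_busted(path: str) -> str:
    """Append a random number so the browser treats every request as new.

    Unlike content-hash or mtime schemes, this defeats caching entirely.
    """
    return "%s?%d" % (path, random.randint(0, 999999))

url = cache_busted("jquery-min-1.2.6.js")
```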

Answered by Erick on Sep 20 '22.


In your references to CSS and JavaScript files, append a version query string, and bump it every time you update the file. The query string is ignored by the web server, but web browsers will treat the URL as a new resource and re-download it.

For example:

<link href="../../Themes/Plain/style.css?v=1" rel="stylesheet" type="text/css" />
<script src="../../Scripts/site.js?v=1" type="text/javascript"></script>
Answered by DSO on Sep 22 '22.


You could edit the HTTP headers of the files to force the browsers to revalidate on each request.
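A sketch of what those headers might look like (Python, purely illustrative; the helper is hypothetical). Cache-Control: no-cache lets the browser store the file but forces it to revalidate, typically answered with 304 Not Modified while the ETag still matches:

```python
def revalidation_headers(etag: str) -> dict:
    """Response headers that allow caching but force revalidation."""
    return {
        "Cache-Control": "no-cache",  # may store, must revalidate each use
        "ETag": etag,                 # validator sent back via If-None-Match
    }

headers = revalidation_headers('"abc123"')
```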

Answered by Ozzy on Sep 20 '22.