I'm retrieving images from a web server directory like this:
WebClient webClientImgDownloader = new WebClient();
webClientImgDownloader.OpenReadCompleted += new OpenReadCompletedEventHandler(webClientImgDownloader_OpenReadCompleted);

if (uriIndex < uris.Count())
    webClientImgDownloader.OpenReadAsync(new Uri(uris[uriIndex], UriKind.Absolute));
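For context, here is a minimal sketch of the completed handler that snippet wires up; the original handler isn't shown in the question, and the BitmapImage target and myImage element are assumptions:

void webClientImgDownloader_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
{
    if (e.Error != null)
        return; // e.g. the image no longer exists on the server

    // Load the downloaded stream into an image source (Silverlight API).
    BitmapImage bitmap = new BitmapImage();
    bitmap.SetSource(e.Result);
    myImage.Source = bitmap; // myImage is an assumed Image element in the XAML
}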
But I've noticed that if I remove the image from the server, Silverlight continues to retrieve it as if it were still there.
When I then type the image URL into Firefox, I see the image as well; but after I click Reload,
Firefox gives me the appropriate error that the image doesn't exist. When I then run my Silverlight application again, it also correctly reports that the image doesn't exist, as if the browser reload had cleared a cache entry somewhere.
How can I force a "refresh" via WebClient in code, so that if an image suddenly no longer exists on the server, Silverlight doesn't keep handing me a cached copy of it?
The basic idea behind browser caching is this: the browser requests some content from the web server; if the content is not in the browser cache, it is retrieved directly from the server, but if the content was previously cached, the browser bypasses the server and loads it straight from its cache.
More generally, web caching works by storing HTTP responses according to rules carried in the response headers, so that subsequent requests for the same content can be fulfilled from a cache closer to the user (the browser's own cache, or a dedicated cache server on the network) instead of travelling all the way back to the web server. This speeds up access to the data and reduces demand on bandwidth, and it is exactly what is happening here: the deleted image is still being served from a cache.
This is a tricky one, as the caching is usually caused by the website's response headers not specifying no-cache. I've found that in the past the easiest way to deal with these caching issues is simply to add a randomised query-string parameter, so that the web server (and every cache in between) treats each request as a fresh one.
If you're currently requesting www.domain.com/image.jpg, then try www.domain.com/image.jpg?rand=XXXX, where XXXX is a random value generated in code each time the request is made.
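Applied to the question's snippet, a minimal sketch (uris, uriIndex, and the handler name are taken from the question; the Guid simply stands in for the random XXXX value):

WebClient webClientImgDownloader = new WebClient();
webClientImgDownloader.OpenReadCompleted += new OpenReadCompletedEventHandler(webClientImgDownloader_OpenReadCompleted);

if (uriIndex < uris.Count())
{
    // Append a throwaway query-string value so no cache recognises the URL.
    string freshUri = uris[uriIndex] + "?rand=" + Guid.NewGuid().ToString("N");
    webClientImgDownloader.OpenReadAsync(new Uri(freshUri, UriKind.Absolute));
}

If a URL may already contain a query string, append with & instead of ?. The trade-off is that every request bypasses caching entirely, so you lose the bandwidth savings a cache would otherwise provide.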
You need to decide what your caching policy is for the various content on your site.
If you must make sure that the latest state is presented whenever a request is made, ensure that the server configures the response headers appropriately. In this case, make sure you have the header Cache-Control: max-age=0
specified on the image (or, more likely, on the folder holding the set of images).
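If the images are served through ASP.NET rather than as plain static files, one way to emit that header is from the code that writes the response. The handler below is a hypothetical sketch (the path and content type are assumptions; for static files you would configure the equivalent header in IIS instead):

using System;
using System.Web;

// Hypothetical handler serving an image with Cache-Control: public, max-age=0.
public class ImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Emit "Cache-Control: public, max-age=0" so clients revalidate every time.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetMaxAge(TimeSpan.Zero);

        context.Response.ContentType = "image/jpeg"; // assumed image type
        context.Response.WriteFile(context.Server.MapPath("~/images/image.jpg"));
    }

    public bool IsReusable { get { return true; } }
}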
By setting max-age=0 you cause the browser to attempt to refetch the image; however, it will inform the server about any version of the image it already holds in its cache. This gives the server the opportunity to respond with status 404 because the image has been deleted, 304 because the image is still there and hasn't changed (so the cached version may be used), or 200 because the image has changed, in which case the response carries the new version.
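You can observe those three outcomes with an ordinary conditional request. Here is a minimal .NET sketch, assuming a placeholder URL (note that HttpWebRequest surfaces 304 and 404 as a WebException rather than a normal response):

using System;
using System.Net;

class ConditionalGetDemo
{
    static void Main()
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.domain.com/image.jpg");
        // Tell the server which version we already have, as a revalidating browser would.
        request.IfModifiedSince = new DateTime(2010, 1, 1);

        try
        {
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("200 OK - the image changed; a new copy was sent.");
            }
        }
        catch (WebException ex)
        {
            HttpWebResponse response = ex.Response as HttpWebResponse;
            if (response != null && response.StatusCode == HttpStatusCode.NotModified)
                Console.WriteLine("304 Not Modified - keep using the cached copy.");
            else if (response != null && response.StatusCode == HttpStatusCode.NotFound)
                Console.WriteLine("404 Not Found - the image has been deleted.");
            else
                throw;
        }
    }
}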