I have just attempted to implement service workers to cache some JSON files and other assets on a static site (running on localhost in Chrome Version 47.0.2526.73 (64-bit)). Using cache.addAll() I have added my files to the cache, and when I open the Resources tab in Chrome and click on Cache Storage, all the files are listed.
The issue I am having is that my service worker is listed as "activated" and "running" in chrome://service-worker-internals; however, I cannot determine if the worker is actually intercepting the requests and serving up the cached files. I have added the event listener, and even when I console.log the event in the service worker's DevTools instance, it never hits the breakpoint:
this.addEventListener('install', function(event) {
  event.waitUntil(
    caches.open('v1').then(function(cache) {
      console.log(cache);
      return cache.addAll([
        '/json/0.json',
        '/json/1.json',
        '/json/3.json',
        '/json/4.json',
        '/json/5.json',
      ]);
    })
  );
});
this.addEventListener('fetch', function(event) {
  console.log(event);
  var response;
  event.respondWith(caches.match(event.request).catch(function() {
    return fetch(event.request);
  }).then(function(r) {
    response = r;
    caches.open('v1').then(function(cache) {
      cache.put(event.request, response);
    });
    return response.clone();
  }).catch(function() {
  }));
});
Basically I am running through things exactly as described in the HTML5 Rocks service workers intro, but I am pretty sure that my assets aren't being served from the cache. I've noticed that assets served by a service worker are marked as such in the Size column of the Network tab in DevTools ('from service worker').
It just seems as if my code is no different from the examples, but it's never hitting the fetch event for some reason. Gist of my code: https://gist.github.com/srhise/c2099b347f68b958884d
After looking at your gist and your question, I think your issue is with scoping.
From what I've determined with service workers (at least with static files), a service worker's maximum scope is the directory it is in. That is to say, it can't handle files/requests/responses that come from a location at or above its own directory, only below it.
For example, /js/service-worker.js will only be able to handle files/requests under /js/ (e.g. in /js/{dirName}/).
Therefore, if you change the location of your service worker to the root of your web project, the fetch event should fire and your assets should load from cache.
So use something like /service-worker.js, which will be able to access the /json directory, since /json is deeper than the service-worker.js file.
This is further explained here, in the "Register A service worker" section. https://developers.google.com/web/fundamentals/getting-started/primers/service-workers
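As a sketch of that rule (the helper names here are illustrative, not part of any spec): the default scope of a service worker is the directory containing its script, and a page is only controlled when its URL path falls inside that scope.

```javascript
// Illustrative sketch of the default-scope rule, runnable anywhere.
function defaultScope(scriptPath) {
  // '/js/service-worker.js' -> '/js/'
  return scriptPath.slice(0, scriptPath.lastIndexOf('/') + 1);
}

function isControlled(pagePath, scriptPath) {
  // A page is controlled only if its path starts with the scope.
  return pagePath.startsWith(defaultScope(scriptPath));
}

console.log(isControlled('/', '/js/service-worker.js'));        // false
console.log(isControlled('/json/0.json', '/service-worker.js')); // true
```

This is why moving the script from /js/service-worker.js to /service-worker.js makes the fetch event fire for the whole site.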
I struggled with this for a long time, and I think the documentation related to the matter is seriously lacking. In my experience, there is a very important distinction:
The service worker can only intercept fetch events if it is in or above the scope of the URL it is accessed from.
For example, my sw.js file was located at /static/sw.js. When accessing my site's root at / and attempting to intercept fetch events to js files in /static/js/common.js, the fetch events were not intercepted, even though the scope of my service worker was /static/ and the js file was in /static/js/.
Once I moved my sw.js file to the top-level scope /sw.js, the fetch events were all intercepted. This is because the scope of the page I was accessing with my browser (/) was the same as the scope of my sw.js file (/).
Please let me know if this clears things up for people, or if I am incorrect!
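For what it's worth, the way the browser derives the registration scope can be sketched with the URL API. registrationScope below is an illustrative helper of mine, not a real API; navigator.serviceWorker.register() and the Service-Worker-Allowed header are the real mechanisms.

```javascript
// Sketch: how a registration's scope is derived. By default the scope
// is the directory of the script URL; an explicit { scope } option is
// resolved against it (widening past the script's directory requires
// the Service-Worker-Allowed response header on the script).
function registrationScope(scriptURL, scope) {
  return new URL(scope || './', scriptURL).href;
}

console.log(registrationScope('https://example.com/static/sw.js'));
// 'https://example.com/static/'
console.log(registrationScope('https://example.com/sw.js'));
// 'https://example.com/'
```

So a page at / is simply outside the scope of a worker registered from /static/sw.js.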
The exact code in the HTML5Rocks article is:
self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request)
      .then(function(response) {
        // Cache hit - return response
        if (response) {
          return response;
        }

        // IMPORTANT: Clone the request. A request is a stream and
        // can only be consumed once. Since we are consuming this
        // once by cache and once by the browser for fetch, we need
        // to clone the response
        var fetchRequest = event.request.clone();

        return fetch(fetchRequest).then(
          function(response) {
            // Check if we received a valid response
            if(!response || response.status !== 200 || response.type !== 'basic') {
              return response;
            }

            // IMPORTANT: Clone the response. A response is a stream
            // and because we want the browser to consume the response
            // as well as the cache consuming the response, we need
            // to clone it so we have 2 stream.
            var responseToCache = response.clone();

            caches.open(CACHE_NAME)
              .then(function(cache) {
                cache.put(event.request, responseToCache);
              });

            return response;
          }
        );
      })
  );
});
The biggest thing that I can see is that you are not cloning the request from the fetch. You need to clone it because it is read twice: once when being used to access the network (in the fetch) and once when being used as the key to the cache.
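The one-shot nature of these streams is easy to demonstrate with the Fetch API's Response, available in browsers and Node 18+. This is a standalone illustration, not part of the answer's worker code:

```javascript
// A Response body is a stream and can be read only once; clone()
// before reading gives you a second, independent stream.
async function demo() {
  const response = new Response('{"id": 0}');
  const copy = response.clone();            // clone BEFORE reading the body

  const forCache = await copy.text();       // one consumer (e.g. the cache)
  const forBrowser = await response.text(); // the other (e.g. the page)

  console.log(response.bodyUsed, copy.bodyUsed); // true true
  return [forCache, forBrowser];
}

demo().then(console.log); // [ '{"id": 0}', '{"id": 0}' ]
```

The same rule applies to Request objects, which is why the article clones event.request before handing it to fetch.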
If you don't see the mark from service worker, then your page is not controlled by the service worker yet (you can check by inspecting navigator.serviceWorker.controller in the client page). The default behaviour for a page is to become controlled the next time you visit it after SW activation, so you have two options: revisit the page, or use the self.skipWaiting() and self.clients.claim() methods during installation and activation respectively to force the service worker to take control of the clients ASAP. Look at the Service Worker Cookbook; it includes a JSON cache recipe.
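The wiring for that second option looks like the sketch below. skipWaiting() and clients.claim() are the real service worker APIs; the `self` object here is a minimal stand-in of mine for the ServiceWorkerGlobalScope, so the flow can be exercised outside a browser.

```javascript
// Minimal mock of the ServiceWorkerGlobalScope (test scaffolding only).
const calls = [];
const self = {
  listeners: {},
  addEventListener(type, fn) { this.listeners[type] = fn; },
  skipWaiting() { calls.push('skipWaiting'); return Promise.resolve(); },
  clients: { claim() { calls.push('claim'); return Promise.resolve(); } },
};

// The actual service-worker wiring the answer describes:
self.addEventListener('install', function (event) {
  event.waitUntil(self.skipWaiting()); // activate without waiting for old clients
});
self.addEventListener('activate', function (event) {
  event.waitUntil(self.clients.claim()); // take over already-open pages
});

// Simulate the browser firing the lifecycle events.
const event = { waitUntil(p) { return p; } };
self.listeners.install(event);
self.listeners.activate(event);
console.log(calls); // [ 'skipWaiting', 'claim' ]
```

With both calls in place, the page is controlled on the very first load instead of after a refresh.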
Once you fix the problem with control, you need to fix your handler. I think you want to apply a policy of cache-first. If not in cache, then go to network and fill the cache. To do that:
self.onfetch = function (event) {
  var req = event.request;
  event.respondWith(function cacheFirst() {
    // Open your cache.
    return self.caches.open('v1').then(function (cache) {
      // Check if the request is in there.
      return cache.match(req).then(function (res) {
        // If no match, there is no rejection but an undefined response.
        if (!res) {
          // Go to network.
          return fetch(req.clone()).then(function (res) {
            // Put in cache and return the network response.
            return cache.put(req, res.clone()).then(function () {
              return res;
            });
          });
        }
        return res;
      });
    });
  }());
};