Offer app access offline

Following up on part one of his guide, Daniel Crisp shows you how to use service workers to serve data from the cache when users can’t get online

net magazine


Welcome back to our two-part service worker tutorial. In part one, we learned about the service-worker life cycle and set up our bare-bones worker to cache and serve our weighty static assets, providing a pretty impressive performance improvement on subsequent visits.

In part two, we’re going to further enhance our worker to cache the dynamic API response, learn about caching strategies and give our app full offline support.

You’re going to need recent versions of Node.js and npm installed on your computer.

Caching dynamic responses

OK, as it stands we’re only caching static assets, such as images and JS libraries. Service workers enable us to cache dynamic responses, such as API responses, but we need to put some thought into it first. And if we want to give our app offline support, we’ll need to cache the ‘index.html’ file, too. We’ve got a few options to choose from for our caching strategy:

● Cache only

● Network only

● Network first, falling back to cache

● Cache first, falling back to network

● Cache then network

Each has its pros and cons. There is an excellent Google article in the Further Reading section that explains each approach and how to implement it.
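The two strategies this tutorial relies on can be sketched as plain functions over an abstract cache and fetch. This is an illustrative sketch, not the real service worker API: here ‘cache’ is anything with match()/put() and ‘fetchFn’ is any promise-returning function, whereas the real Cache API works on Request/Response objects.

```javascript
// Network first: try the network, keep a copy, fall back to the cache.
async function networkFirst(cache, fetchFn, request) {
  try {
    const response = await fetchFn(request); // try the network first
    await cache.put(request, response);      // keep a copy for offline use
    return response;
  } catch (err) {
    const cached = await cache.match(request); // network failed: try cache
    if (cached !== undefined) return cached;
    throw err; // offline and nothing cached
  }
}

// Cache first: serve from cache if present, otherwise hit the network.
async function cacheFirst(cache, fetchFn, request) {
  const cached = await cache.match(request);
  return cached !== undefined ? cached : fetchFn(request);
}
```

With the real Cache API you would also need to clone() the response before putting it in the cache, as the tutorial code does later on.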

The code we added above for the static assets uses the cache-first, falling back to network approach. We can safely do this because our static assets are ‘revved’. We need to decide what’s best for our dynamic API response, though.

The answer depends on the data returned by the server, how critical fresh data is to your users and how frequently you call the endpoint. If the data is likely to change frequently, or if it is critical that it is up to date, then we don’t want to be serving stale data from our cache by default. However, if you are going to be polling the endpoint every ten seconds, say, then perhaps cache-first is more suitable and you can update the cache in the background in preparation for the next request.

The other consideration with caching API responses is user-specific responses. If your app enables users to log in then you need to remember that multiple users may use the same computer. You don’t want to be serving a cached user profile to a different user!
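As a sketch of that kind of filtering, a small helper could gate what is allowed into the cache. The endpoint paths below are hypothetical examples, not part of our app:

```javascript
// Hypothetical helper: decide whether a request is safe to cache.
// The '/profile' and '/account' paths are illustrative stand-ins for
// user-specific endpoints you would not want shared between users.
function isCacheable(url, method) {
  if (method !== 'GET') return false; // never cache mutations
  if (url.includes('/profile') || url.includes('/account')) return false;
  return true;
}
```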

In our scenario we can assume that the response from the API will be changing frequently. For a start, it is responding with the forecast for the next 120 hours (five days) in three-hour chunks, meaning if we call it again in three hours’ time we will get a different response than we get now. And, of course, this is weather forecast data, so at least here in the UK it will be changing all the time. For that reason, let’s go for network first, then fall back to cache. The user will get the cached response only if the network request fails, perhaps because they are offline.

Caching the index

This is also a safe approach for our ‘index.html’ file, so we’ll include that in the cache, too. Remember, you don’t want to end up with users stuck on a stale version of your app (in their cache) because you’ve cached everything too aggressively. Another option here is to change the cache name – that is, ‘v1-assets’ becomes ‘v2-assets’ – with each new release, but this approach has additional overhead because you need to add code to manually clean up the old caches. For the purposes of this tutorial we’ll take the simpler option!
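If you did go the versioned-name route, the clean-up boils down to deleting every cache whose name the current worker no longer uses. A minimal sketch (the function name is ours, not part of the tutorial code):

```javascript
// Given every cache name in the browser and the names the current worker
// still uses, return the stale ones. In a real worker you would call this
// from an 'activate' listener, passing the result of caches.keys(), and
// then caches.delete() each name it returns.
function staleCaches(allNames, currentNames) {
  return allNames.filter((name) => !currentNames.includes(name));
}
```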

The fetch listener

Currently our existing fetch listener looks for a match for a request in all caches, but it always follows the cache-first approach. We could modify this to switch modes but we’d end up with an unwieldy listener. Instead, just as you can with normal JS, we’ll simply add another fetch listener. One will handle the assets cache and the other will handle the dynamic cache.

We need to include some of the same checks to filter out unwanted requests, then we want to allow certain requests to be cached. Add this new listener below your existing fetch listener:

/**
 * DYNAMIC CACHING
 */
self.addEventListener('fetch', (event) => {
  // Ignore non-GET requests
  if (event.request.method !== 'GET') {
    return;
  }

  // Ignore browser-sync
  if (event.request.url.indexOf('browser-sync') > -1) {
    return;
  }

  let allow = false;

  // Allow index route to be cached
  if (event.request.url === (self.location.origin + '/')) {
    allow = true;
  }

  // Allow index.html to be cached
  if (event.request.url.endsWith('index.html')) {
    allow = true;
  }

  // Allow API requests to be cached
  if (event.request.url.startsWith('https://api.openweathermap.org')) {
    allow = true;
  }

  if (allow) {
    // Dynamic caching logic goes here...
  }
});

Dynamic cache

We’re going to store these responses in a different cache, although this isn’t strictly necessary. We’ll call this one ‘v1-dynamic’. Add this at the top of the ‘sw.js’ file:

const dynamicCacheName = 'v1-dynamic';

We don’t need to create this cache when the worker installs because it only caches responses reactively – that is, after the browser has made the request. Instead we can do all the work in the fetch listener. Let’s add the network-first logic inside our if (allow) statement:

// Detect requests to API
if (event.request.url.startsWith('https://api.openweathermap.org')) {
  // Network first
  event.respondWith(
    // Open the dynamic cache
    caches.open(dynamicCacheName).then((cache) => {
      // Make the request to the network
      return fetch(event.request)
        .then((response) => {
          // Cache the response
          cache.put(event.request, response.clone());
          // Return the original response
          return response;
        });
    })
  );
} else {
  // ...

This code opens the cache, makes the network request, caches the network’s response and then returns the response to the page.

Open up the app. Reload the page to get the latest version of the worker. Now click through until you see the result page, meaning a request has been made to the API.

Once that has happened, check in DevTools again and you should see the two caches, with the cached API response and index route in the dynamic cache.


So we’ve got our cached response, but if we go offline again you’ll see that the app still fails to load.

Why is this? Well, we’ve not told the worker what to do when the network request fails. We can correct this by adding a catch method to the end of the fetch(event.request) promise chain:

.catch(() => {
  // On failure look for a match in the cache
  return caches.match(event.request);
});

Now save and try this again in offline mode. Hopefully you’ll now see the app working as if it were online! Pretty cool.

Managing expectations

Right, so we’ve got a fully functioning offline-capable app. But it isn’t going to magically work all the time. The data we get from the API is time-sensitive, so we could end up in a situation where the cached response is served up but it is out of date and none of the data is relevant. It’s worth noting that cached data doesn’t expire automatically – it has to be manually removed or replaced – so we can’t set an expiry date like we can with a cookie.
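One way to simulate an expiry, assuming you store (or can derive) a timestamp for each cached entry, is to compare its age against a threshold at read time. A sketch, with an arbitrary one-day default:

```javascript
// Sketch: cached entries never expire on their own, so treat an entry as
// expired once its age exceeds a threshold (default of one day is arbitrary).
function isExpired(cachedAtMs, nowMs, maxAgeMs = 24 * 60 * 60 * 1000) {
  return nowMs - cachedAtMs > maxAgeMs;
}
```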

The question our app asks is ‘Will it rain today?’, yet we get five days’ worth of data in the API’s response so, in theory, the cached version will be valid for five days, even though the forecast will become less accurate as time goes by.

We should consider these two scenarios to manage the user’s expectations:

● User is offline and has been served an old, almost out-of-date cache.

● User is offline and the cached data is out-of-date.

We can detect the user’s network status in the page, but on a mobile, non-WiFi connection it’s possible that the connection was lost momentarily just as the API request was being made. So rather than displaying a ‘You are offline’ message for a brief flicker, it would be better to determine that the response received by the page is from the cache rather than the network.

Fortunately, because our data already contains date/time information, we can determine if the data is from the cache by checking if the first date is in the past. If this wasn’t the case, we’d probably be able to modify the body of the response in the worker before caching it to include a timestamp.

Online dating

Time to open up the app’s ‘main.js’ file. On line 172 you’ll see that we are already creating an array called inDateItems that filters the full array so that it only contains forecast items for today’s date. Then, below this, we check if the array has any items. If it is empty we show an error message to the user informing them that the data is out-of-date, so this already covers one of the scenarios. But what about when the data is old but not fully out of date?

We can check this by comparing the date of the first item in the array to now, to see if the difference exceeds a certain threshold. You can add these constants just inside the inDateItems.length check:

// ...
// Ensure we actually have relevant data points
if (inDateItems.length) {
  // Define a threshold amount. This is one day in milliseconds
  const staleThreshold = 24 * 60 * 60 * 1000;
  // Check difference between first data point and now
  const dateDiff = Date.now() - inDateItems[0]._date;
  // Does the date exceed the threshold?
  const dataIsStale = dateDiff > staleThreshold;
  // ...

Now we have a Boolean to flag if our data is stale or not, but what should we do with it? Well, here’s a little something we made earlier… add this below the lines you’ve just added:

// Show the message if data is stale
if (dataIsStale) {
  showStale();
}

This pre-prepared method will display a message to the user telling them the data is stale (it’s not easy to simulate stale data, so call showStale() without the dataIsStale check to manually show the UI). In addition, it provides a button that will allow them to refresh the data, and a warning message if they are currently offline. When offline, the button is disabled.

This is easily achieved by listening to the online and offline events that are emitted on the window, but we also need to check the initial state because the events are only emitted when the status changes. Our new service worker allows the page to be loaded even when there is no connection, so we can’t assume we have a connection when the page renders. Check the code in ‘main.js’ to see how this is implemented.
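A sketch of that pattern, assuming only the standard navigator.onLine property and the window online/offline events (the function and callback names are ours, not from ‘main.js’):

```javascript
// Report the initial connectivity state, then every change. 'win' is the
// browser window, injected here so the logic is testable outside a browser.
function watchConnectivity(win, onChange) {
  onChange(win.navigator.onLine); // initial state: no event fires for it
  win.addEventListener('online', () => onChange(true));
  win.addEventListener('offline', () => onChange(false));
}
```

In a page you would call watchConnectivity(window, setButtonState) once on load, and use the callback to enable or disable the refresh button.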

Jumping through hoops

Now this is when development starts to get tricky. We’re making changes to files that are cached by our service worker, but because we’re in dev mode the file names aren’t revved. Changes to the worker itself are automatically picked up and handled because we ticked the ‘Update on reload’ option in DevTools, but the cached assets aren’t reloaded because we’re using a cache-first approach – meaning we don’t get to see our changes to the app’s code.

Once again DevTools comes to the rescue. Next to the ‘Update on reload’ option is an option called ‘Bypass for network’. This slightly obscure name doesn’t make it obvious (at least not to me!) what it actually does. But if you tick this option then all requests will come from the network, rather than the service worker.

It doesn’t disable the worker entirely, so you will still be able to install and activate the worker, but you can be sure that everything comes from the network.

Removing the stale cache

So we know we’ve got a stale response in the cache, but how do we rectify this? In this scenario we don’t really need to do anything, because once the user has reconnected to the internet they can run the request again and the cache will be updated in the background – just one of the benefits of a network-first approach.

However, for the purposes of this tutorial, we wanted to demonstrate how you can clean up stale items in your cache.

As it stands, the worker is manipulating the cache but only the page is aware that the data is out of date. How can the page tell the worker to update the cache? Well, we don’t have to. The page can access the cache directly itself.

There is a click handler for the refresh data button ($btnStale) ready to be populated on line 395 (approx).

Just as in the worker, we need to open the cache using its name first. We named our API cache ‘v1-dynamic’, so we have to use the same name here. Once open, we can request that the cache deletes the item matching the request URL. Add the following inside the click handler to do the magic:

// Reload the data
$btnStale.on('click', () => {
  caches.open('v1-dynamic').then((cache) => {
    // Get the API url
    const url = getUrl(latLng);
    // Delete the cached entry
    cache.delete(url).then(() => {
      // Re-fetch the result
      fetchResult();
    });
  });
});

In production you’d need to check that the browser supports the Cache API before implementing this.
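A minimal feature check might look like this (the function name is ours; the check assumes the standard ‘caches’ global exposed by supporting browsers):

```javascript
// The Cache API is exposed as a 'caches' property on the global object in
// supporting browsers, so an 'in' check is enough to guard usage.
function supportsCacheApi(globalObj) {
  return typeof globalObj === 'object' && globalObj !== null && 'caches' in globalObj;
}
```

In page code you would call supportsCacheApi(window) and skip the delete-and-refetch logic when it returns false.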


Done. In part one of this tutorial, we reduced our subsequent load time from around 30 seconds to less than one second. Now, in part two, we’ve made the app fully offline compatible.

Hopefully, that’ll give you a good grounding in how to set up a simple service worker and show you some of the things you need to be aware of. You can most definitely use this in production today. Good luck!

w: danielcrisp.com
job: Senior front-end developer – contract/freelance
areas of expertise: JavaScript (Angular), HTML and CSS

Above: Once a request has been made to the API, check in DevTools again and you should see the two caches, with the cached API response and index route in the dynamic cache
