Restoring trust in tech

Peter Bihr explains the need to apply the principles of the open web to restore confidence in IoT and AI

net magazine · Illustration by Kym Winters


In the early days of the internet, the web was largely open and decentralised. But in the roughly 25 years since, the consumer internet has changed: today, the web is more consolidated and centralised in its structures than ever before. Network effects tend to favour a winner-takes-all dynamic, and so we have, by and large, one really big search engine, one really big social network and one really big e-commerce site.

But consolidation isn't the only thing that has changed. Over time, security and privacy safeguards have been added, like end-to-end encryption for web traffic (although less so for email). These safeguards have been tacked onto existing structures and retrofitted into standards. They weren't part of the internet's original design; they simply weren't necessary in the web's original, academically focused ecosystem.

Emerging tech is all around us

For emerging tech today, especially the internet of things (IoT) and artificial intelligence (AI), it's very different. We are now creating a data layer that extends to, and shapes, our physical environments.

In this context, openness and safeguards for security and privacy are essential. We now casually embed internet-connected microphones and cameras in living rooms and bedrooms. This different context requires different thinking. We need to be able to trust the technology we live with.

To think this through, consider three different contexts: the smart home, the smart city and algorithmic decision-making (AKA artificial intelligence or AI).

Let's first look at IoT in the smart home. Voice assistants have microphones that by definition are always listening (to a degree) or at the very least could be. In political science, the potential or threat of abuse is considered just about as bad as the real thing because it can lead to chilling effects (en.wikipedia.org/wiki/Chilling_effect) – if someone feels like they might be spied on, they change their behaviour. How is this relevant to how we design connected products? As we add more and more microphones (and other sensors) to our physical environment, we multiply the potential for abuse. If we want folks to use connected products, we need to ensure they know they can trust them. Otherwise the privacy of our homes is a thing of the past.

Now zoom out of the home and onto the city: when smart-city technology with all its sensors and algorithms is rolled out across the urban fabric, it applies to everyone. Nobody can opt out of public space. So this had better work – and work well – for everyone. Rather than optimising for efficiency alone, smart cities should promote openness, be transparent and allow for well-intentioned 'hacking' (in the sense of modifying for unexpected needs).

Finally, the third frontier: algorithmic decision-making, or AI. Algorithms make decisions that impact all areas of our lives, from managing resource allocation (netm.ag/2MvoU6J) to predictive policing (netm.ag/2CXDTXU). And so we need to make sure that we understand the algorithms – effectively making them more open – in order to guarantee appropriate mechanisms for governance, accountability and recourse. Governments need to understand that algorithmic decision-making directly affects people's lives (netm.ag/2QuCxGD).

People are wary of emerging technologies and you can't blame them: large-scale data-driven systems with little openness, oversight, accountability and transparency – in other words, systems that aren't built within an ethical, healthy framework – are likely to cause massive damage and unintended consequences. So let's do better.

Tech needs to be trustworthy

To be clear, this isn't an exercise in making consumers trust emerging technologies more – it's an exercise in making emerging technologies more trustworthy. Today's consumers don't have good ways to make informed decisions about, say, a connected device's trustworthiness. In his book Radical Technologies (netm.ag/2xanARZ), Adam Greenfield sums up the dilemma: “Let's be clear: none of our instincts will guide us in our approach to the next normal.” Gut feeling won't cut it. We need better mechanisms, design practices and tools.

Luckily, there are promising approaches to tackle this. As an industry, we must follow through with best practices in all things data-related. As consumers, we need to demand better from industry. And as citizens we need policy makers to get smart about regulation. Fortunately, after the Snowden revelations shook consumer trust in connected devices like never before, things have been looking up.

Policy makers are slowly starting to get ahead of technology, rather than playing catch-up. The European General Data Protection Regulation (GDPR) is the first major regulatory initiative in this space that tries to protect consumer data at scale. (If and how the GDPR will play out over time remains to be seen.) California followed up with the California Consumer Privacy Act, which offers GDPR-like provisions.

In the tech industry, there is a growing awareness of the need to design emerging tech to be better and more open – digital wellbeing initiatives by Apple and Google and the debates on how to thwart fake news are just two current examples of the industry trying to get its house in order.

Consumers benefit from all of this, but they still don't have good tools to evaluate which products or companies deserve their trust. This, too, can change. As an example, take a concrete project we initiated this year: the Trustable Tech Mark, a consumer trust mark for connected devices. Developed by the ThingsCon network with support from Mozilla, the Trustable Tech Mark will soon offer an assessment framework to determine which connected devices are trustworthy. It looks at five dimensions: openness, privacy & data practices, security, transparency and stability.

The Trustable Tech Mark aims not just to weed out the really inferior products at the bottom of the pile but also to highlight the ones that are truly trustworthy and that employ – or establish – best practices for user rights. For example, imagine an intelligent smart-home assistant that does all its data processing on the device, without sending sensitive data to the cloud. Or smart lighting that avoids privacy risks by not putting microphones in its light bulbs. Or a company that ensures that, in case of bankruptcy or an acquisition, user data remains safe and the code is released as open source, so the product will keep working even after the company is gone.

The Trustable Tech Mark is only one of what we hope will be many initiatives to empower consumers to make better-informed decisions and make emerging tech more open. If industry, policy makers and consumers can all agree that transparency, decentralisation, accountability and openness are the conditions that enable trust in technology, then we can look forward to an exciting – rather than scary – decade of emerging technology. As designers, developers and technologists, we have an outsized role to play in this journey, but we can – and should – also demand better as consumers. Industry and policy makers will respond to that pressure. In the end, all parties benefit from better, more trustworthy emerging tech.

“As we add more sensors to our physical environment, we multiply the potential for abuse. If we want folks to use connected products, we need to ensure they can trust them”

Peter Bihr (@peterbihr) explores the impact of emerging technologies. He is a Mozilla Fellow, founder and managing director of The Waving Cat and co-founder of ThingsCon.
