Your data or your life

Financial Mirror (Cyprus) – Front Page

Apple’s new watch keeps track of your health. Google Now gathers the information needed to compute the ideal time for you to leave for the airport. Amazon tells you the books you want, the groceries you need, the films you will like – and sells you the tablet that enables you to order them and more. Your lights turn on when you get close to home, and your house adjusts to your choice of ambient temperature.

This amalgamation and synthesis of digital services and hardware is designed to make our lives easier, and there is no doubt that it has. But have we stopped asking fundamental questions, both of ourselves and of the companies we entrust to do all of these things? Have we given sufficient consideration to the potential cost of all of this comfort and ease, and asked ourselves if the price is worth it?

Every time we add a new device, we give away a little piece of ourselves. We often do this with very little knowledge about who is getting it, much less whether we share their ethics and values. We may have a superficial, check-box understanding of what the companies behind this convenience do with our data; but, beyond the marketing, the actual people running these organisations are faceless and nameless. We know little about them, but they sure know a lot about us.

The idea that companies could know where we are, what we have watched, or the content of our medical records was anathema a generation ago. The vast array of details that defined a person was widely distributed. The bank knew a bit, the doctor knew a bit, the tax authority knew a bit, but they did not all talk to one another. Now Apple and Google know it all and store it in one handy place. That is great for convenience, but not so great if they decide to use that information in ways to which we have not proactively agreed.

And we have reason to call into question companies’ judgment in using that data. The backlash against the news that Facebook used people’s news feeds to test whether what they viewed could alter their moods was proof of that. I do not recall checking a box to say that was okay. Recently, hackers misappropriated photos sent via Snapchat, a service used primarily by young people that promises auto-deletion of all files upon viewing.

Likewise, health-care data were always considered private, so that patients would be open and honest with health-care professionals. As the lines between health care and technology businesses become hazy, some manufacturers of “wearables” and the software that runs on them are lobbying to have their products exempted from being considered medical devices – and thus from regulatory requirements for reliability and data protection.

Privacy is only one part of a larger discussion around data ownership and data monopoly, security, and competition. It is also about control and destiny. It is about choice, and about proactively deciding how people’s data are used and how people use their own data.

More mature firms have phased in formal protocols, with ethics officers, risk committees, and other structures that oversee how data are collected and used, though not always successfully (indeed, they often depend on trial and error). Small new companies may have neither such protocols nor the people – for example, independent board members – to impose them. If serious ethical lapses occur, many consumers will no longer use the service, regardless of how promising the business model is.

We like new applications and try them out, handing over access to our Facebook or Twitter accounts without much thought about the migration of our personal data from big companies with some modicum of oversight to small companies without rigorous structures and limits. Consumers believe or expect that someone, somewhere, is keeping an eye on this – but who, exactly, would that be?

In Europe, legislation to protect personal data is not comprehensive, and much of the rest of the world lacks even rudimentary safeguards. Having explored this issue with legislators in several countries over the past couple of months, I have found it abundantly clear that many do not have a full grasp of the myriad issues that need to be considered. It is a difficult subject to address, and doing so is impeded by lobbying efforts and incomplete information.

In the short term, young companies should view ethics not as a marketing gimmick, but as a core concern. All organisations should invest in ethics officers or some sort of review process involving people who can assess all of the implications of a great-sounding idea. Legislators need to educate themselves – and the public – and exercise more oversight. For example, just as many countries did with car seatbelts a generation ago, a public-safety campaign could be paired with legislation to explain and promote two-step verification.

In the longer term, as we rightly move toward universal Internet access, we need to ask: How much of ourselves are we willing to give away? What happens when sharing becomes mandatory – when giving access to a personal Facebook account is a job requirement, and health services are withheld unless a patient submits their historical Fitbit data?

If that is the future we want, we should stride toward it with full awareness and a sense of purpose, not meander carelessly until we fall into a hole, look up, and wonder how we got there.
