Online data concerns go far beyond privacy

The Denver Post - OPINION - By Mike Skirpan

There is no shortage of news telling us all about the end of privacy. Though we are cautioned this may be trending us toward dystopia, in reality, most consumers opt for the convenience and necessity of utilizing technology products for the meager cost of giving away their data.

We look past the mild creepiness of someone else owning and analyzing our data and reason that the trade-off is a fair one. Targeted marketing and personalized services are good things, right?

What is often overlooked is that the utility of this data for companies goes far beyond telling you your favorite musician is coming to town or that the shoes you have been eyeing are on sale.

The two most alarming factors of our data-driven world that lie in the background are predictive inference and algorithmic decision-making. Until the general public sees the future of our data being owned, bought, and sold by companies not as the death of privacy, but as the introduction of a new society of automated decisions and subtle manipulations, we will continue to overlook the true consequence.

In short, predictive inference is the idea that once anyone has a lot of data about a population of individuals, they can begin making statistical guesses about those people.

Consider the fact that Facebook could guess someone's sexual orientation or religious affiliation very accurately by simply analyzing their Facebook likes. Or perhaps more disturbing, Facebook was categorizing teenagers by their insecurities and vulnerabilities. And we are only at the tip of the iceberg.

Within research circles, the cutting edge shows an ability to accurately predict whether someone has a mental illness from their social media posts, or what their body mass index is from a single photo. What these trends illuminate is that using people's data to infer information beyond what is actively disclosed is the way of the future.

None of us should be surprised if the next decade or two sees health insurance companies start levying premiums based not on our medical records alone, but also on what they can infer about our lifestyle and health using our data.

The other part of our rapidly approaching future is autonomous artificial intelligence systems that shape our opportunities and treatment.

Very simply, our data is also being traded and used as training data to make machine-intelligent systems that perform very impressive tasks with no human aid. There are exciting dimensions of this work, as AI is likely to improve experts' abilities to do their work and allow the masses to have machine assistance to perform tasks that otherwise require years of training (if you can afford it, of course).

However, it is important to further recognize that these systems are being used to determine prison sentencing, decide what news articles we see, and filter job and college applications. In the current arrangement, nearly every action you take online or in a mobile app is captured to be utilized as training data for one of these machine-intelligent systems. Thus, the data we create and give away freely not only gets turned into grand profits for a small tech elite, but also is converted into real, tangible systems that shape our lives, and increasingly so.

It is this invisible transfer of power that, whether you support it or not, makes it worth your time to be aware of it.

With all this said, it should make a bit more sense why Jeff Bezos was interested in buying Whole Foods and why Google rapidly acquires companies that gain any sizable user population. It is not about invading your privacy. Rather, it is about gaining intelligence and building new systems.

So as you go forward choosing with whom to share data and what to share, please do not fall prey to the red herring of privacy. Remember that each click lays another brick for our future society. And it might just be time to form an opinion before it's automatically decided for you.
