Personalization vs Privacy – Where do we draw the line?


Data, big or small, has been a hot topic in publishing and marketing circles for a number of years now, heralded as the secret to what separates the wheat from the chaff in the race to revenue. But like all instruments of power, data must be handled with tender loving care when it involves human beings.

Recently I started thinking about how data is being used (and abused) by publishers attempting to create more personalized experiences for consumers: the Holy Grail for differentiating brands.

But as much as most people like the idea of personally relevant information (editorial and advertorial) at their fingertips, when that data actually starts showing up in all the wrong places, many start having a creepy feeling that Big Brother is watching them.

And given some recent stories in the news and personal experiences, it's hard to mock that paranoia.

The slippery slope of our perishing privacy

John Perry Barlow, former lyricist for the Grateful Dead and co-founder of the Electronic Frontier Foundation, said, “Relying on the government to protect your privacy is like asking a peeping Tom to install your window blinds.”

When I first read that, I immediately thought, “There goes another conspiracy theorist,” but Barlow wasn't far off from the truth. Privacy today seems to be as fleeting as a Snapchat photo.

Look at the recent case of smart meters being installed in US homes. Analyzing smart meter data has many potential benefits, such as giving consumers insight into their energy use. Not only does this help them understand the optimal times to use energy throughout the day; minimizing use during peak hours increases reliability and can save them money.

But this is uncharted territory: smart meters can also reveal private details about what's happening in the house. The meter data shows when you're at home, away, sleeping or even showering, patterns that are highly valued by nefarious home invaders.

A lawsuit was filed arguing that metered data should fall within the bounds of the Fourth Amendment to the US Constitution, which protects people from unreasonable searches and seizures.

A federal district court ruled against the motion, stating that Americans can't reasonably expect any privacy in the data collected by these devices. The case is under appeal, but the issue has far-reaching implications.

The rapid growth of the Internet of [hackable] Things (IoT) is making our privacy and security even more susceptible to abuse, at least for now.

For example, last October attackers exploited vulnerabilities in connected devices around the world to launch a massive Distributed Denial of Service (DDoS) attack against Dyn, a DNS provider based in New Hampshire, taking down roughly 1,000 websites including Twitter, Spotify, Netflix, Amazon, Tumblr, Reddit and PayPal.

In February 2017, a smart doll called Cayla was banned in Germany by its Federal Network Agency, the Bundesnetzagentur, because the doll can record and transmit other people's conversations without their knowledge, and then use that data to advertise directly to the child or parents. Does the toy maker actually abuse the power of that data? I don't know, but similar concerns certainly exist with other voice-controlled assistants such as Apple's Siri, Microsoft's Cortana, Amazon's Echo, and Google Home. Just imagine what hackers could do if they got hold of all that intel. It's not just creepy; it's downright scary.

And let's not forget Samsung and its “allegedly” hackable TVs.

Hitting too close to home

Too many times lately, I've been talking to friends about a specific subject or product only to open my Facebook account and see ads for it in my newsfeed. I know, I know: there's no empirical evidence that proves the social titan is guilty of unscrupulous surveillance, but with a history full of duplicitous behavior, one can't help but wonder…

Remember back in 2009, when Facebook promised that our personal information was private when, in fact, it was sharing it with others, an abuse that resulted in charges from the Federal Trade Commission?

Then there was the time the Electronic Privacy Information Center (EPIC) filed a motion with the FTC accusing Facebook of deceptive trade practices and violation of a 2012 Consent Order after it was caught manipulating the newsfeeds of ~700K users.

Just last summer, Facebook flip-flopped on whether it was truly generating friend recommendations based purely on the location of people's smartphones. Then it was found mining people's cell phone numbers, an abuse of privacy that resulted in patients of the same psychiatrist seeing each other in their “People You May Know” boxes on Facebook.

Then, to top it all off, in October 2016 Facebook made it impossible for us to hide our profiles from complete strangers. Facebook also agreed late last year to settle a class-action lawsuit over allegations that it scanned private messages between users.

I could go on, but I'm not here to bash Facebook (although they do leave themselves wide open to it); search engines and other social sites are also working behind the scenes with our data.

The balance of power between personalization and privacy is a tenuous one, and if the state of Mark Zuckerberg's Mac (camera and microphone taped over) is any indication, he's not taking any chances with prying digital stalkers either.

The two sides of personalization

It feels like we are living in an era of digital hyper-personalization. According to Accenture's 2016 Personalization Pulse Check:

70% of consumers are generally comfortable with news sites collecting personal data if the publisher is transparent about how it uses it; 75% are comfortable if they can personally control how it is being used.

What's interesting is that 68% of respondents said they were highly satisfied with the use of their personal data by Netflix and Hulu because it helps them find shows they like, despite the fact that there is little transparency or user control over the data.

This apparent inconsistency suggests that consumers draw a clear line between invasive personalization tactics and helpful ones.

Invasive personalization

We are bombarded with content every day, most of it a pure waste of bits and bytes. From alternative facts and propaganda to intrusive, bandwidth-hungry advertising, it's no wonder that publishers' attempts at personalization have become a major contributor to the rise of ad blocking.

Dwelling on this internet infestation here isn't going to solve the problem, but it's worth pointing out again that it's not users who are to blame for the US$41.4 billion loss in revenues. The blame lies on the shoulders of publishers who continue to abuse visitors with an abysmal advertising experience.

But this dead horse is not worth kicking anymore. Instead, let's talk about the businesses that understand what quality personalization is all about.

Helpful personalization

If you stopped a stranger on the street and asked them who offers the best online personalized service, I would bet that Amazon, Netflix, and Spotify would be near the top of the list. All have found a way to delight customers through creative use of their personal data, without those customers really understanding what's happening under the hood.

For Netflix, it all comes down to its highly intelligent recommendation system, a combination of algorithms focused on engaging and retaining the interest of viewers. It collects vast amounts of data describing what each member watches and how (device type, time of day and week, intensity of watching, etc.), where each video was discovered, and even which recommendations were suggested but not viewed.
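The kind of event logging described above can be sketched roughly as follows; the field names, event shape, and scoring are illustrative assumptions, not Netflix's actual schema or algorithm:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ViewEvent:
    member_id: str
    title_id: str
    device: str              # e.g. "tv", "phone" (hypothetical values)
    hour_of_day: int
    watched_fraction: float  # 0.0-1.0 of the title, the "intensity of watching"
    discovered_via: str      # where the video was discovered, e.g. "search"

def engagement_scores(events):
    """Aggregate watched fractions per title for one member: a crude engagement signal."""
    scores = defaultdict(float)
    for e in events:
        scores[e.title_id] += e.watched_fraction
    return dict(scores)

events = [
    ViewEvent("m1", "show_a", "tv", 21, 0.9, "row:trending"),
    ViewEvent("m1", "show_a", "phone", 8, 0.4, "row:continue-watching"),
    ViewEvent("m1", "show_b", "tv", 22, 0.2, "search"),
]
scores = engagement_scores(events)  # show_a ranks well above show_b
```

In a real system these raw events would feed far richer models, but even this toy aggregation shows why watching behavior is more informative than explicit ratings alone.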

Spotify adopted a Facebook-style newsfeed of personalized, frequently updated playlists with Discover Weekly: an algorithm that analyzes a person's listening history, combines it with what's new and hot on Spotify, and delivers 30 new songs each week that it thinks the user will like. Last year it took personalization to another level with Release Radar, an algorithmically personalized weekly playlist of newly released songs from artists each user already likes.
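The Release Radar idea, as described, boils down to a filter-and-rank over a listener's history; this minimal sketch uses data shapes and a ranking rule that are my own assumptions, not Spotify's actual algorithm:

```python
def release_radar(listening_history, new_releases, limit=30):
    """Pick this week's new releases by artists the listener already plays,
    ranked by how often the listener has played each artist."""
    # Count plays per artist in the listener's history.
    plays = {}
    for track in listening_history:
        plays[track["artist"]] = plays.get(track["artist"], 0) + 1
    # Keep only new releases by artists the listener already knows.
    picks = [t for t in new_releases if t["artist"] in plays]
    # Most-played artists first.
    picks.sort(key=lambda t: plays[t["artist"]], reverse=True)
    return picks[:limit]

history = [{"artist": "A"}, {"artist": "A"}, {"artist": "B"}]
releases = [{"title": "x", "artist": "B"},
            {"title": "y", "artist": "A"},
            {"title": "z", "artist": "C"}]
playlist = release_radar(history, releases)  # y (artist A) first, then x
```

Discover Weekly is harder, since it must recommend artists the listener has never played, which is why it also leans on what's trending across other users with similar tastes.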

This under-the-radar (excuse the pun) data crunching flies in the face of transparency, but the fact is that both companies' use of personal data is in the best interest of their users. It's not out to sell them something, but rather to increase engagement by improving their viewing and listening experience. That experience is worth the price of admission, i.e. access to their personal viewing and listening preferences.

Years before Netflix introduced all-access video streaming to the internet, PressReader offered an all-you-can-read news platform for readers of all ages, a service whose prime objective was to create an engaging user experience in the discovery and consumption of content.

It's never easy trying to serve multiple demographics with a single product, especially when that product includes a growing list of thousands of sources. When it comes to news consumption, one size definitely does not fit all.

To facilitate “frictionless discovery” and serve today's multi-generational readership, we needed to offer high-quality content and a recommendation engine seamlessly embedded in the newsfeed, one that tracked much more than clicks, likes, and shares, including:

• What's trending, i.e. content that retained other readers' attention the longest

• Content that is relevant based on what the user is reading now and has consumed before

• Articles that others who read the same article enjoyed

• Stories that people with similar tastes have engaged with
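One simple way to think about combining the four signals above is a weighted blend per article; the weights and signal values here are invented for illustration and are not PressReader's actual model:

```python
def blended_score(signals, weights=None):
    """Weighted sum of normalized recommendation signals (each in [0, 1])."""
    weights = weights or {
        "trending": 0.2,       # dwell time across all readers
        "relevance": 0.4,      # similarity to current and past reading
        "co_read": 0.2,        # enjoyed by readers of the same article
        "similar_taste": 0.2,  # engagement by readers with similar tastes
    }
    # Missing signals default to 0 so a new article can still be scored.
    return sum(w * signals.get(name, 0.0) for name, w in weights.items())

article = {"trending": 0.8, "relevance": 0.5, "co_read": 0.3, "similar_taste": 0.9}
score = blended_score(article)  # single ranking score for the newsfeed
```

A production engine would learn these weights from engagement data rather than hard-coding them, but the principle of fusing several behavioral signals into one ranking score is the same.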

Then, to ensure that the content presentation was optimized for any device and any browser, we used a predictive algorithm that built, on the fly, all possible layouts of the content for whatever device a reader used, tested each of them against the device and browser's capabilities, and then chose the best one for that reader. Because, as I've said before, it's all about delivering the right content to the right person at the right time, the way they want to receive it.
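That generate-test-choose flow can be sketched like this; the capability checks and quality scores are simplified assumptions for illustration, not the actual layout engine:

```python
def choose_layout(candidate_layouts, device):
    """Keep the layouts this device can render, then return the best-scoring one."""
    feasible = [
        layout for layout in candidate_layouts
        if layout["min_width"] <= device["width"]
        and (not layout["needs_touch"] or device["touch"])
    ]
    if not feasible:
        raise ValueError("no candidate layout fits this device")
    return max(feasible, key=lambda layout: layout["quality"])

# Hypothetical candidate layouts generated for one article.
layouts = [
    {"name": "3-column", "min_width": 1024, "needs_touch": False, "quality": 0.9},
    {"name": "2-column", "min_width": 600,  "needs_touch": False, "quality": 0.7},
    {"name": "card",     "min_width": 320,  "needs_touch": True,  "quality": 0.6},
]
phone = {"width": 414, "touch": True}
best = choose_layout(layouts, phone)  # only the "card" layout fits this phone
```

The same content thus renders as a dense multi-column page on a desktop browser and as a touch-friendly card stack on a phone, without the reader configuring anything.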

Customer first – quality always

If there is anything you should take away from this article, it's that customer-first innovation must be your number one priority, regardless of your industry or target audience. Because, at the end of the day, all that matters is the consumer and their experience with your brand, an experience that must get better and smarter the more they interact with it.
