Banking on Surveillance Technology

As technology companies do more with payment card data, banks must decide whether to welcome the practice or stand in its way.

ISO & Agent - Inside - 09/10.2017 - By Penny Crosman

As payment card data gets easier to track, issuers and other payments companies must decide whether to facilitate this data’s use or restrict it.

Google’s tracking of credit card purchases and linking them to users’ online profiles and search patterns raises a number of knotty questions for banks. The tech giant says it just wants to show advertisers that the ads they placed led to sales, and there’s no reason to doubt the company’s intention. But if consumers understood that their card transaction data was being sold to Google, would they sanction this? Or would they ask the banks and card issuers that collect and store their transaction data to think carefully, and perhaps ask their consent, before passing this information over to third parties?

These questions are relevant to banks because they are complicit in the march toward “surveillance capitalism” — a world where consumers’ every move is recorded without their knowledge and the information is monetized.

Although it is seldom talked about, some banks sell customer data to third parties. Banks also feed customer data to data aggregators such as Yodlee, which anonymize and sell that information to third parties such as hedge funds. The hedge funds use it to predict company performance and make trading decisions. Mastercard and Visa also sell card transaction data to third parties.

Most banks have a vested interest in making sure customer data is not misused in any way, since they incur a lot of the costs of fraud, such as card reissuance and credit monitoring, said Al Raymond, specialist leader for privacy and data protection at Deloitte and former head of U.S. privacy at TD Bank.

“They want to stay close to the data, as the ounce of surveillance prevention more than outweighs the pound of cure,” Raymond said. “Staying close to the customer is a very real, and recent goal, since large banks are trying to fight off the threat from smaller, nimbler fintech players that are chipping away at bank customers, particularly the millennial segment.”

U.S. banks can’t sell raw consumer data to third parties unless they provide the customer with a notice and an opportunity to opt out. In some states, customers have to opt in. They can sell anonymized data, with all personally identifiable information stripped out or hashed. Companies often use such data sets to come up with rules they can apply to their customer base, such as: a 42-year-old living on the Upper West Side of Manhattan is likely to purchase certain items at a particular time of day.
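The article doesn’t describe how banks strip or hash identifiers in practice; a minimal sketch of the idea, with entirely hypothetical field names and a made-up record, might look like this: direct identifiers are dropped, and quasi-identifiers like the card number are replaced with a salted one-way hash so records can still be grouped for pattern analysis.

```python
import hashlib

# Hypothetical raw transaction record; all field names are illustrative only.
record = {
    "card_number": "4111111111111111",
    "name": "Jane Doe",
    "zip": "10024",
    "age": 42,
    "merchant": "J. Crew",
    "amount": 68.00,
}

PII_FIELDS = {"name"}            # stripped out entirely
HASHED_FIELDS = {"card_number"}  # replaced with a salted one-way hash

def anonymize(rec, salt="example-salt"):
    """Drop direct identifiers and hash quasi-identifiers."""
    out = {}
    for key, value in rec.items():
        if key in PII_FIELDS:
            continue  # remove the field entirely
        if key in HASHED_FIELDS:
            # SHA-256 of salt + value: stable token, not reversible in practice
            out[key] = hashlib.sha256((salt + str(value)).encode()).hexdigest()
        else:
            out[key] = value
    return out

anon = anonymize(record)
```

The point of hashing rather than deleting the card number is that the same card still maps to the same token, so a buyer’s purchases can be linked to each other without revealing who the buyer is — which is exactly why, as the next section shows, “anonymized” is a weaker guarantee than it sounds.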

But even anonymized data raises at least three privacy issues.

1. Anonymized data can be de-anonymized

In a report published in April, Stanford and Princeton researchers described how they linked de-identified web browsing histories to social media profiles using only publicly available data.

“This was a case study in how much people share on the internet without even realizing it and how uniquely identifiable that is,” said Jessica Su, a computer science Ph.D. student at Stanford University and an author of the report. “We believe that the set of things that are recommended to somebody or the set of websites somebody uses on the internet very much uniquely identify them.”

The team asked Twitter users to donate their browsing histories and then looked at the links they clicked on while visiting Twitter. They mapped the newsfeeds to the browsing histories, and where there were many similarities they made a match. They successfully identified people around 70% of the time. If they had used the time stamps on the browsing histories and Twitter posts, that rate would probably have been much higher, Su said.
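The core idea — match an anonymous browsing history to whichever user’s feed it overlaps most — can be sketched in a few lines. This is a deliberately simplified toy, not the researchers’ actual method (which used more sophisticated statistical modeling); the users and links are invented, and overlap is measured with plain Jaccard similarity.

```python
def jaccard(a, b):
    """Overlap between two sets of links, from 0.0 (disjoint) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def best_match(browsing_history, candidate_feeds):
    """Return the candidate user whose feed best overlaps the anonymous history."""
    return max(candidate_feeds,
               key=lambda user: jaccard(browsing_history, candidate_feeds[user]))

# Toy data: links that appeared in each user's Twitter feed (hypothetical).
feeds = {
    "alice": ["nyt.com/a", "wired.com/b", "espn.com/c"],
    "bob":   ["nyt.com/a", "cooking.com/x", "cooking.com/y"],
}

# A "de-identified" browsing history that clicked two links from Alice's feed.
history = ["nyt.com/a", "espn.com/c", "unrelated.com/z"]

match = best_match(history, feeds)  # "alice" — her feed overlaps most
```

With realistic feeds of hundreds of links, the set of links a person was shown is close to unique, which is why even this crude matching identified people most of the time in the study.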

Su said she isn’t concerned that Google or any other large company would take advantage of this ability to identify people, and thus know that the individual who bought a pink cardigan at 6:45 a.m. on June 7 was me.

Companies like Google and Facebook “usually have clear policies on what they can do with user data,” Su said. “When I was at Microsoft Research, there were very strict controls on what could be done with so-called personally identifiable data. The moral of this story is that a lot of information is personally identifiable.”

Aggregate information used to find patterns and formulas is far more valuable than the activities of individuals, said Boris Segalis, co-chair of data protection, privacy and cybersecurity at Norton Rose Fulbright, a New York City-based law firm.

“They don’t care that you bought the cardigan,” Segalis said. “That’s low-value information.”

What Su does worry about is “the sketchiest small companies,” she said.

“If somebody releases some data set that’s very private and sensitive but anonymous, and somebody else goes and de-anonymizes it using statistical methods, that could be pretty bad.”

Stuart Lacey, founder and CEO of the data privacy technology company Trunomi, is alarmed about companies’ being able to connect these dots.

“The extent to which this is being done and what we’re just finding out now is a gulf that will be filled with alligators and surprises,” he said. “I don’t think many people realize just how much is being done.”

2. Even if it remains anonymized, consumers don’t know how their data is being used

Do consumers have the right to know where their data is being sent even if it’s anonymized?

“That’s a question of policy,” Segalis said. “You can certainly have the view that you don’t want someone in the commercial space to figure out your shopping patterns. Ultimately you as a consumer generate that data, even if it’s not associated with you.”

On the other hand, the use of more and better data and analytics could benefit customers as well as companies, Segalis argued.

“It could be annoying that someone can predict your shopping patterns,” he said. “But the same data analytics tools are used to predict traffic patterns and make driving safer, to help with pharmacological research. … It’s probably hard to stop data because it drives so much business today.”

3. If you try to explain to consumers how their data is being used, they probably won’t read the explanation

Banks have to provide privacy notices that disclose what they do with customer data, but often the useful information is buried in legalese.

Lacey pointed out that Apple’s iTunes agreement runs 3,600 words on 27 pages. “No one reads it,” he said.

He’s in favor of using a consent widget that would clearly state what information is being shared with whom for how long. (The European Union’s General Data Protection Regulation calls this “informed consent.”)
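The three facts such a widget surfaces — what data, shared with whom, for how long — also describe the record a system would need to keep behind it. A minimal sketch of that record follows; the class and field names are hypothetical illustrations, not Trunomi’s actual product or anything mandated by the GDPR.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ConsentRecord:
    """One grant of consent: what data, shared with whom, for how long."""
    data_category: str   # e.g. "card transactions"
    recipient: str       # e.g. "advertising partner"
    granted_on: date
    duration_days: int
    opted_in: bool = False  # no sharing unless the customer explicitly opts in

    def expires_on(self):
        return self.granted_on + timedelta(days=self.duration_days)

    def is_active(self, today=None):
        """Sharing is permitted only while opted in and before expiry."""
        today = today or date.today()
        return self.opted_in and today <= self.expires_on()

# Example: 90 days of consent to share transaction data, granted Sept. 1, 2017.
consent = ConsentRecord("card transactions", "advertising partner",
                        granted_on=date(2017, 9, 1), duration_days=90,
                        opted_in=True)
```

Defaulting `opted_in` to `False` and making consent expire automatically encodes the opt-in posture Lacey describes: sharing stops unless the customer has affirmatively said yes, recently.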

“That’s the way the customers and the banks we talk to see this going — judicious and appropriate, measured use of data,” Lacey said.

He points to Twitter’s new privacy policy, and the consent form it recently pushed to users, as a good example.

“They had eight clean sentences with little sliders beside every one,” he said. “It was well run, they asked relevant questions, they explained it in understandable language and I opted out of everything.”

He wonders, however, how many among Twitter’s 328 million monthly users bothered to look at the consent agreement.

“People become so blinded by, ‘Yes, just get me on my Twitter, I just need to share this thing,’ ” he said.

Segalis also bemoaned ultralong privacy notices. “They try to describe everything a company does on that piece of paper because of plaintiffs’ lawyers and consumer advocacy groups,” he said. “It’s easier for them to be super detailed than to think about it like Twitter thought about it.”

The worst-case scenario

Lacey paints a dark picture of the future if data continues to be shared thoughtlessly.

“The more a few parties have more information about any one thing and they can control the flow of relevancy of information and what you see, what you do, it starts to become a little Orwellian,” he said.

One day the pink cardigan I buy from J. Crew will have a near-field communication chip in it, Lacey said. The chip will be designed to be read by my washing machine, which will warn me not to put it in with certain other fabrics.

“That seems like a good use case,” he said. But NFC-enabled clothing will become an identifier that can be used to locate people.

“Now we’ve got a whole mechanism for mass surveillance globally and now, all these companies will be trading off that information to not just figure out who you are and what you buy, but locate you and figure out your habits,” Lacey said.

“What I worry about is we’re not taking anywhere near enough time to understand the way in which we’re collecting data about people, what it’s being used for, and by whom for what reason,” Lacey said.

Not everyone shares this dire view. “My experience in working with legitimate, large companies, is they have chief privacy officers and they try to do the right thing,” Segalis said. “They use this data for their own purposes, because in many cases it opens opportunities and makes them more profitable. I don’t think we can roll that back.”

But Lacey said he sees the EU’s data protection regulation as a gift that could guide companies back to the light, “like Luke in ‘Star Wars,’ when he’s balancing on the ship and you don’t know if he’s going to go bad or good. Suddenly, if guided the right way, you can make good choices and the result can be quite compelling.”
