The Day

How your Facebook likes could profile voters for manipulation

By BARBARA ORTUTAY and ANICK JESDANUN

New York — Facebook likes can tell a lot about a person. Maybe even enough to fuel a voter-manipulation effort like the one a Trump-affiliated data-mining firm stands accused of — and which Facebook may have enabled.

The social network is now under fire after The New York Times and The Guardian newspaper reported that former Trump campaign consultant Cambridge Analytica used data, including user likes, inappropriately obtained from roughly 50 million Facebook users to try to influence elections.

Facebook’s stock plunged 7 percent Monday in its worst one-day decline since 2014. Officials in the EU and the U.S. sought answers, while Britain’s information commissioner said she will seek a warrant to access Cambridge Analytica’s servers because the British firm had been “uncooperative” in her investigation. After two years of failing to disclose the harvesting, Facebook said Monday that it hired an outside firm to audit Cambridge.

Researchers in a 2013 study found that Facebook likes on hobbies, interests and other attributes can predict personal attributes such as sexual orientation and political affiliation. Computers analyze such data to look for patterns that might not be obvious, such as a link between a preference for curly fries and higher intelligence.
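The pattern-matching the researchers describe can be illustrated, in a deliberately simplified way, with a toy nearest-neighbour classifier: a new user is assigned the attribute of the labeled user whose set of liked pages overlaps most with theirs. This is only a sketch of the general idea, not the study's actual method, and every user, page name and label below is made up.

```python
# Toy illustration of predicting an attribute from "likes" via set overlap.
# All pages, users and labels here are hypothetical.

def jaccard(a, b):
    """Overlap between two sets of liked pages (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical training data: liked pages -> known attribute label.
labeled_users = [
    ({"curly_fries", "science_blog", "chess_club"}, "A"),
    ({"science_blog", "puzzle_page"},               "A"),
    ({"monster_trucks", "reality_tv"},              "B"),
    ({"reality_tv", "tabloid_news"},                "B"),
]

def predict(likes):
    """Guess the new user's label from the most similar labeled user."""
    best = max(labeled_users, key=lambda lu: jaccard(likes, lu[0]))
    return best[1]

print(predict({"curly_fries", "puzzle_page"}))  # most overlap with "A" users
```

Real systems use far richer models over millions of users, but the principle is the same: likes that correlate with a known attribute in one group let a machine guess that attribute for everyone else.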

Chris Wylie, a Cambridge co-founder who left in 2014, said the firm used such techniques to learn about individuals and create an information cocoon to change their perceptions. In doing so, he said, the firm “took fake news to the next level.”

“This is based on an idea called ‘informational dominance,’ which is the idea that if you can capture every channel of information around a person and then inject content around them, you can change their perception of what’s actually happening,” Wylie said Monday on NBC’s “Today.” It’s not yet clear exactly how the firm might have attempted to do that.

Late Friday, Facebook said Cambridge improperly obtained information from 270,000 people who downloaded an app described as a personality test. Those people agreed to share data with the app for research — not for political targeting. And the data included who their Facebook friends were and what they liked — even though those friends hadn’t downloaded the app or given explicit consent.

Cambridge got limited information on the friends, but machines can use detailed answers from smaller groups to make good inferences on the rest, said Kenneth Sanford of the data science company Dataiku.
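One simple way such inference can work is label propagation over the friendship graph: attributes known for the small group of app users are spread to their friends, for instance by majority vote. The sketch below is a hedged illustration of that general technique, not Cambridge Analytica's actual pipeline; all names, labels and friendships are invented.

```python
# Hypothetical: attribute labels are known only for the small group of
# users who installed the app; their friends' labels are inferred.
from collections import Counter

app_user_labels = {"ann": "A", "bob": "A", "cal": "B"}

# Friendship lists for people who never installed the app.
friends_of = {
    "dee": ["ann", "bob", "cal"],
    "eli": ["cal"],
}

def infer_label(user):
    """Majority label among a non-user's labeled friends (None if none)."""
    votes = Counter(app_user_labels[f]
                    for f in friends_of.get(user, [])
                    if f in app_user_labels)
    return votes.most_common(1)[0][0] if votes else None

print(infer_label("dee"))  # "A" wins two votes to one
```

With 270,000 app users each exposing hundreds of friends, even a crude rule like this reaches tens of millions of people, which is how a small consenting group can yield inferences about a far larger one.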

Cambridge was backed by the conservative billionaire Robert Mercer, and at one point employed Stephen Bannon — later President Donald Trump’s campaign chairman and White House adviser — as a vice president. The Trump campaign paid Cambridge more than $6 million, according to federal election records, although officials have more recently played down that work.

The type of data mining reportedly used by Cambridge Analytica is fairly common, but is typically used to sell diapers and other products. Netflix, for instance, provides individualized recommendations based on how a person’s viewing behaviors fit with what other customers watch.

But that common technique can take on an ominous cast if it’s connected to possible election meddling, said Robert Ricci, a marketing director at Blue Fountain Media.

Wylie said Cambridge Analytica aimed to “explore mental vulnerabilities of people.” He said the firm “works on creating a web of disinformation online so people start going down the rabbit hole of clicking on blogs, websites etc. that make them think things are happening that may not be.”
