The Asian Age

Facebook ‘likes’ profile voters for manipulation


Facebook “likes" can tell a lot about a person. Maybe even enough to fuel a vote- manipulati­on effort like the one a Trumpaffil­iated data- mining firm stands accused of — and which Facebook may have enabled.

The social network is under fire after The New York Times and The Guardian newspaper reported that former Trump campaign consultant Cambridge Analytica used data, including user likes, inappropriately obtained from roughly 50 million Facebook users to try to influence elections.

Monday was a wild roller coaster ride for Facebook, whose shares plunged 7 per cent in its worst one-day decline since 2014. Officials in the EU and the US sought answers, while Britain's information commissioner said she will seek a warrant to access Cambridge Analytica's servers because the British firm had been “uncooperative” in her investigation. The first casualty of that investigation was an audit of Cambridge that Facebook had announced earlier in the day; the company said it “stood down” that effort at the request of British officials.

Adding to the turmoil, the New York Times reported that Facebook security chief Alex Stamos will step down by August following clashes over how aggressively Facebook should address its role in spreading disinformation. In a tweet, Stamos said he's still fully engaged at Facebook but that his role has changed.

It would have been quieter had Facebook likes not turned out to be so revealing. Researchers in a 2013 study found that likes on hobbies, interests and other attributes can predict personal attributes such as sexual orientation and political affiliation. Computers analyse such data to look for patterns that might not be obvious, such as a link between a preference for curly fries and higher intelligence. It's not yet clear exactly how the firm might have attempted to do that.
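
To make the idea concrete, here is a minimal Python sketch, using entirely made-up data and the open-source scikit-learn library (not any tool Cambridge Analytica is known to have used), of how a classifier can be fitted to guess a personal attribute from the pages a user has liked.

```python
# Illustrative sketch only: a hypothetical "likes" matrix and trait label.
# All data is random and synthetic, not drawn from any real users.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 50

# Each row is one user; each column is 1 if that user liked a given page.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Hypothetical target trait, loosely correlated with a handful of pages
# (standing in for the kind of weak link researchers reported, such as a
# liked page being mildly predictive of an attribute).
signal = likes[:, :5].sum(axis=1)
trait = (signal + rng.normal(0, 1, n_users) > 2.5).astype(int)

# Fit a simple classifier that maps like patterns to the trait.
model = LogisticRegression(max_iter=1000).fit(likes, trait)

# Estimated probability that a new user with a given like pattern has the trait.
new_user = rng.integers(0, 2, size=(1, n_pages))
print("Predicted probability:", model.predict_proba(new_user)[0, 1])
```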

Late Friday, Facebook said Cambridge improperly obtained information from 2,70,000 people who downloaded an app described as a personality test. Those people agreed to share data with the app for research — not for political targeting. And the data included who their Facebook friends were and what they liked — even though those friends hadn't downloaded the app or given explicit consent.

Cambridge got limited information on the friends, but machines can use detailed answers from smaller groups to make good inferences on the rest, said Kenneth Sanford of the data science company Dataiku.
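
In the same illustrative spirit, and again with synthetic data rather than anything from the actual case, the sketch below fits a model on a small labelled group and then scores a much larger group whose labels the modeller never sees, the kind of extrapolation Sanford describes.

```python
# Illustrative sketch only: learn from a small "app user" group with known
# labels, then make guesses about a much larger "friends" group. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_small, n_large, n_pages = 500, 20000, 50

def make_group(n):
    # Random like patterns plus a trait weakly tied to the first few pages.
    likes = rng.integers(0, 2, size=(n, n_pages))
    trait = (likes[:, :5].sum(axis=1) + rng.normal(0, 1, n) > 2.5).astype(int)
    return likes, trait

small_likes, small_trait = make_group(n_small)   # detailed, consented data
large_likes, large_trait = make_group(n_large)   # limited data on "friends"

# Fit only on the small group, then infer traits for the larger one.
model = LogisticRegression(max_iter=1000).fit(small_likes, small_trait)
guesses = model.predict(large_likes)

# How well inferences from the small group carry over to the larger one.
print("Accuracy on the larger group:", accuracy_score(large_trait, guesses))
```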

Cambridge was backed by the conservative billionaire Robert Mercer, and at one point employed Stephen Bannon — later President Donald Trump's campaign chairman and White House adviser — as a vice president. The type of data mining reportedly used by Cambridge Analytica is fairly common but is typically used to sell diapers and other products.

Wylie told “Today" that while political ads are also targeted at specific voters, the Cambridge effort aimed to make sure people wouldn't know they were getting messages aimed at influencin­g their views.

The Trump campaign has denied using Cambridge's data. The firm itself denies wrongdoing and says it didn't retain any of the data pulled from Facebook and didn't use it in its 2016 campaign work.

It's possible that Cambridge tapped other data sources, including what Ted Cruz's presidential campaign app collected. Cambridge CEO Alexander Nix said during the Cruz campaign that the firm had five or six sources of data on each voter.

Facebook declined to provide officials for interviews and didn't immediately respond to requests for information beyond its statements Friday and Monday. Cambridge also didn't immediately respond to emailed questions. Facebook makes it easy for advertisers to target users based on nuanced information about them.

Facebook's mapping of the “social graph” — essentially the web of people's real-life connections — is also invaluable for marketers. Two-thirds of Americans get at least some of their news on social media, according to the Pew Research Center. While people don't exist in a Facebook-only vacuum, it is possible that bogus information users saw on the site could later be reinforced by the “rabbit hole” of clicks and conspiracy sites on the broader internet, as Wylie described.

PHOTO: PIXABAY
