Los Angeles Times

Can Facebook users’ reality be distorted?

A firm tied to Trump’s campaign allegedly tried to ‘change their perception of what’s actually happening.’

- By Tracey Lien

SAN FRANCISCO — Many Facebook users rely on the social network to figure out what’s going on in the world. But what if the world Facebook shows them is wildly distorted?

That’s the question raised after a former employee of a data mining firm that worked for Donald Trump’s presidential campaign alleged the company used Facebook to bombard specific individuals with misinformation in hopes of swaying their political views.

The accusations raised alarm across the Atlantic on Monday, sparking an investigation into the firm, Cambridge Analytica, by the United Kingdom’s Information Commissioner’s Office. In the U.S., Sen. Ron Wyden (D-Ore.) sent a letter asking Facebook Chief Executive Mark Zuckerberg whether the social media giant was aware of other data violations on its platform, and why it failed to take action sooner.

The controversy drove Facebook’s stock price down nearly 7% on Monday, suggesting that investors are feeling skittish about the regulatory liabilities of a company that has spent the last year dogged by questions of fake news and Russian propaganda.

The scope of Facebook’s problems ballooned after Christopher Wylie, a political strategist who used to work for Cambridge Analytica, alleged on NBC’s “Today” show Monday that the firm believed that if it could “capture every channel of information around a person and then inject content around them, you can change their perception of what’s actually happening.”

By mining Facebook user data, Wylie said, the company could tailor the ads and articles users would see — a practice he calls “informational dominance.”

In a video secretly recorded by Britain’s Channel 4, Mark Turnbull, managing director of Cambridge Analytica’s political division, suggests users targeted by the firm wouldn’t know their online experience was being manipulated.

“We just put information into the bloodstream of the internet ... and then watch it grow, give it a little push every now and again ... like a remote control,” he said. “It has to happen without anyone thinking, ‘That’s propaganda,’ because the moment you think ‘That’s propaganda,’ the next question is, ‘Who’s put that out?’ ”

Turnbull, according to Channel 4, also bragged about the firm’s practice of recording politicians in compromising situations involving bribes and sex workers.

In a statement sent to The Times, Cambridge Analytica accused Channel 4 of entrapment and rejected the allegations made in the report. In a separate statement, also issued Monday, the firm said it did not carry out “personality targeted advertising” for President Trump’s campaign.

The company obtained the Facebook data linked to 50 million accounts through a Cambridge University psychology professor who had permission to gather information on users of the social media platform, but violated Facebook guidelines by passing it on to a third party for commercial purposes. Although Cambridge Analytica said in a news release over the weekend that it deleted the data as soon as it learned it had broken Facebook’s rules, Wylie alleged that the firm continued to use the information.

What’s worrisome about Cambridge’s alleged practice, say social media and psychology experts, is that it works on even the most rational of people.

“Attribution theory teaches us that if you hear the same thing from multiple sources, then you start believing that it might be true even if you originally questioned it,” said Karen North, a social media professor at USC who has also studied psychology.

In Cambridge Analytica’s case, Wylie on Monday accused the firm of going beyond simply serving targeted ads to people on Facebook. He alleged that the firm “works on creating a web of disinformation” so that unwitting consumers are confronted with the same lies and false stories both on and off Facebook.

“Even if you thought it was just one biased person or one paid ad, when you start to see it everywhere, you start thinking there’s a critical mass of people or experts that buy into the same position,” North said. “You start to believe there must be a groundswell of support for it.”

The ability to target ads at individuals isn’t unique to Facebook. But what makes the social media giant’s role profound is the breadth and depth of information it collects and the sheer number of people who use the service. Last year, 67% of Americans told Pew Research that they get at least some of their news on social media. In 2016, 64% of those who got their news from social media got it from only one source — most commonly Facebook.

Since the 2012 presidential campaign, Facebook has been the “No. 1 destination” for digital media strategists looking to influence politics, according to Laura Olin, a digital strategist who ran social media strategy for former President Obama’s reelection campaign.

Prior to that election, campaigns spread their focus among Facebook, Twitter and traditional media outlets, she said. But in 2012, three things became clear:

- People were spending more of their online time on Facebook than anywhere else.
- Facebook reached a broader demographic than its competitors.
- Ads could be targeted more effectively on Facebook than on other platforms.

The Obama campaign that year was able to aim ads and messages at voters based on gender, location and existing political beliefs.

“We showed people what it could look like,” said Olin, who ran Obama’s Facebook pages during the campaign. “From there, people realized they could use paid advertising to reach voters in a targeted way. I feel some guilt over any potential part I might have played in that.”

Digital media experts such as Olin worry that the problem of misinformation on Facebook is likely to get worse before it gets better.

In 2013, 47% of Americans used Facebook as a source for news, according to research from Pew. By 2016, that number had grown to 63%. Nearly 2.2 billion people visit Facebook’s website and app every month, and its subsidiaries continue to grow, with Instagram commanding nearly a billion monthly active users, WhatsApp recording more than a billion users, and Messenger at more than 900 million users.

Facebook has pledged to more than double its team of 10,000 content moderators by the end of 2018 to keep false and misleading information in check. But with hundreds of millions of photos, videos and articles uploaded to Facebook daily, security experts question whether that will be enough.

Despite the rampant misinformation on the platform, users flock to it, North said. Policing its platform will be especially hard for Facebook, she said, because the tools used for propaganda — the wealth of information it collects and its microtargeted advertisements — are the same ones Facebook uses to generate revenue.

Gathering and selling access to that kind of granular data helped increase Facebook’s advertising revenue last year by 49%. Advertising accounted for more than 98% of Facebook’s total revenue in 2017, according to company filings.

So despite Facebook’s share price dropping $12.53 on Monday to $172.56 after the Cambridge Analytica allegations, multiple analysts maintained a “buy” rating on the company’s stock.

“What’s important to understand is that all social media platforms can be ‘weaponized,’ so this is not limited to Facebook by any means,” analysts at Monness, Crespi, Hardt & Co. said in a note to investors.

Or, as Olin put it: “No one thinks of themselves as a fake news consumer. We all assume we’re smarter than that.”

