Houston Chronicle

HOW FICTION BECOMES FACT ON SOCIAL MEDIA

It’s the interaction of technology with our biases that makes us vulnerable


HOURS after the Las Vegas massacre, Travis McKinney’s Facebook feed was hit with a scattershot of conspiracy theories. The police were lying. There were multiple shooters in the hotel, not just one. The sheriff was covering for casino owners to preserve their business.

The political rumors sprouted soon after, like digital weeds. The killer was anti-Trump, an “antifa” activist, said some; others made the opposite claim, that he was an alt-right terrorist. The two unsupported narratives ran into the usual stream of chatter, news and selfies.

“This stuff was coming in from all over my network of 300 to 400” friends and followers, said McKinney, 52, of Suffolk, Va., and some posts were from his inner circle.

But he knew there was only one shooter; a handgun instructor and defense contractor, he had been listening to the police scanner in Las Vegas with an app. “I jumped online and tried to counter some of this nonsense,” he said.

In recent weeks, executives from Facebook and Twitter have appeared before congressional committees to answer questions about the use of their platforms by Russian hackers and others to spread misinformation and skew elections. During the 2016 presidential campaign, Facebook sold more than $100,000 worth of ads to a Kremlin-linked company, and Google sold more than $4,500 worth to accounts thought to be connected to the Russian government.

Agents with links to the Russian government set up an endless array of fake accounts and websites and purchased a slew of advertisements on Google and Facebook, spreading dubious claims that seemed intended to sow division all along the political spectrum — “a cultural hack,” in the words of one expert.

Yet the psychology behind social media platforms — the dynamics that make them such powerful vectors of misinformation in the first place — is at least as important, experts say, especially for those who think they’re immune to being duped. For all the suspicions about social media companies’ motives and ethics, it is the interaction of the technology with our common, often subconscious psychological biases that makes so many of us vulnerable to misinformation, and this has largely escaped notice.

Skepticism of online “news” serves as a decent filter much of the time, but our innate biases allow it to be bypassed, researchers have found — especially when presented with the right kind of algorithmically selected “meme.”

At a time when political misinformation is in ready supply, and in demand, “Facebook, Google and Twitter function as a distribution mechanism, a platform for circulating false information and helping find receptive audiences,” said Brendan Nyhan, a professor of government at Dartmouth College.

For starters, said Colleen Seifert, a professor of psychology at the University of Michigan, “People have a benevolent view of Facebook, for instance, as a curator, but in fact it does have a motive of its own. What it’s actually doing is keeping your eyes on the site. It’s curating news and information that will keep you watching.”

That kind of curating acts as a fertile host for falsehoods by simultaneously engaging two predigital social-science standbys: the urban myth as “meme,” or viral idea; and individual biases, the automatic, subconscious presumptions that color belief. The first process is largely data-driven, experts said, and built into social media algorithms. The wide circulation of bizarre, easily debunked rumors — so-called Pizzagate, for example, the canard that Hillary Clinton was running a child sex ring from a Washington-area pizza parlor — is not entirely dependent on partisan fever (though that was its origin).

For one, the common wisdom that these rumors gain circulation because most people conduct their digital lives in echo chambers or “information cocoons” is exaggerated, Nyhan said.

In a forthcoming paper, Nyhan and colleagues review the relevant research, including analyses of partisan online news sites and Nielsen data, and find the opposite. Most people are more omnivorous than presumed; they are not confined in warm bubbles containing only agreeable outrage.

But they don’t have to be for fake news to spread fast, research also suggests. Social media algorithms function at one level like evolutionary selection: Most lies and false rumors go nowhere, but the rare ones with appealing urban-myth “mutations” find psychological traction, then go viral.

There is no precise formula for such digital catnip. The point, experts said, is that the very absurdity of the Pizzagate lie could have boosted its early prominence, no matter the politics of those who shared it.

“My experience is that once this stuff gets going, people just pass these stories on without even necessarily stopping to read them,” McKinney said. “They’re just participating in the conversation without stopping to look hard” at the source.

Digital social networks are “dangerously effective at identifying memes that are well adapted to surviving, and these also tend to be the rumors and conspiracy theories that are hardest to correct,” Nyhan said.

One reason is the raw pace of digital information sharing, he said: “The networks make information run so fast that it outruns fact-checkers’ ability to check it. Misinformation spreads widely before it can be downgraded in the algorithms.”

The extent to which Facebook and other platforms function as “marketers” of misinformation, similar to the way they market shoes and makeup, is contentious. In 2015, a trio of behavioral scientists working at Facebook inflamed the debate in a paper published in the prominent journal Science.

The authors analyzed the news feeds of some 10 million users in the U.S. who posted their political views, and concluded that “individuals’ choices played a stronger role in limiting exposure” to contrary news and commentary than Facebook’s own algorithmic ranking — which gauges how interesting stories are likely to be to individual users, based on data they have provided. Outside critics lashed the study as self-serving, while other researchers said the analysis was solid and without apparent bias.

The other dynamic that works in favor of proliferating misinformation is not embedded in the software but in the biological hardware: the cognitive biases of the human brain.

Purely from a psychological point of view, subtle individual biases are at least as important as rankings and choice when it comes to spreading bogus news or Russian hoaxes — like a false report of Muslim men in Michigan collecting welfare for multiple wives.

Merely understanding what a news report or commentary is saying requires a temporary suspension of disbelief. Mentally, the reader must temporarily accept the stated “facts” as possibly true. A cognitive connection is made automatically: Clinton-sex offender, Trump-Nazi, Muslim men-welfare.

And refuting those false claims requires a person to first mentally articulate them, reinforcing a subconscious connection that lingers far longer than people presume.

Over time, for many people, it is that false initial connection that stays the strongest, not the retractions or corrections: “Was Obama a Muslim? I seem to remember that …”

In a recent analysis of the biases that help spread misinformation, Seifert and co-authors named this and several other automatic cognitive connections that can buttress false information.

Another is repetition: Merely seeing a news headline multiple times in a news feed makes it seem more credible before it is ever read carefully, even if it’s a fake item being whipped around by friends as a joke.

And, as salespeople have known forever, people tend to value the information and judgments offered by good friends over all other sources. It’s a psychological tendency with significant consequences now that nearly two-thirds of Americans get at least some of their news from social media.

“Your social alliances affect how you weight information,” Seifert said. “We overweight information from people we know.”

The casual, social, wisecracking nature of thumbing through and participating in the digital exchanges allows these biases to operate all but unchecked, Seifert said.

Stopping to drill down and determine the true source of a foul-smelling story can be tricky, even for the motivated skeptic, and mentally it’s hard work. Ideological leanings and viewing choices are conscious, downstream factors that come into play only after automatic cognitive biases have already had their way, abetted by the algorithms and social nature of digital interactions.

“If I didn’t have direct evidence that all these theories were wrong” from the scanner, McKinney said, “I might have taken them a little more seriously.”

Stephen Savage illustrations / The New York Times
