Bangkok Post

Facebook’s demons reappear with Alex Jones

The platform has proved unable to stop or fully address the spread of paranoia, hatred and misinformation, writes Max Fisher


To Americans, Facebook’s problem with Alex Jones might seem novel, even unprecedented. When does speech become unsafe? When can it be limited? Should those decisions be up to a private company at all? And if a company shies away from acting, as Facebook did with Jones until Apple moved first, where does that leave the rest of us?

But to activists and officials in the developing world, both the problem and Facebook’s muddled solutions will be old news.

Before there was Alex Jones, the US conspiracy theorist, there was Amith Weerasinghe, the Sri Lankan extremist who used Facebook as his personal broadcast station.

Mr Weerasinghe leveraged Facebook’s newsfeed to spread paranoia and hatred of the country’s Muslim minority. He enjoyed near-total freedom on the platform, despite repeated pleas from activists and officials for the company to intervene, right up until his arrest on charges of inciting a riot that killed one Muslim and left many more homeless.

Before there was Mr Weerasinghe, there was Ashin Wirathu, the Myanmar extremist, whose Facebook hoaxes incited riots in 2014. Three years later, Mr Wirathu would contribute to a wave of Facebook-based rumours and hate speech that helped inspire widespread violence against Myanmar’s Rohingya minority.

And so on.

“Facebook doesn’t get that they’re the largest news agency in the world,” Harindra Dissanayake, a Sri Lankan official, said a few days after Mr Weerasinghe’s arrest.

The problem, he said, goes beyond a few under-regulated extremists. It also involves the algorithm-driven newsfeed that is core to the company’s business model. “They are blind to seeing the real repercussions,” Mr Dissanayake said of Facebook’s leaders.

Developing countries’ experiences with Facebook suggest that the company, however noble its intent, has set in motion a series of problems we are only beginning to understand and that it has proved unable or unwilling to fully address:

Reality-distorting misinformation that can run rampant on the newsfeed, which promotes content that will reliably engage users.

Extremism and hate speech that tap into users’ darkest impulses and polarise politics.

Malicious actors granted near-limitless reach on one of the most sophisticated communications platforms in history, relatively unchecked by social norms or traditional gatekeepers.

And a private company uneager to wade into contentious debates, much less pick winners and losers.

Facebook — and many Westerners — have long treated those issues as safely “over there”, meaning in countries with weaker institutions, lower literacy rates and more recent histories of racial violence. Last month, a company official, announcing new policies to restrict speech that leads to violence, referred to “a type of misinformation that is shared in certain countries”.

But chillingly similar, Facebook-linked problems are becoming increasingly visible in wealthy, developed countries like the United States. So is the difficulty of solving those problems — and the consequences of Facebook’s preference for action that can be incremental, reactive and agonisingly slow.

‘SOMETHING BAD COULD HAPPEN’

Although Facebook officials often portray the violence associated with the platform as new or impossible to predict, the incidents date to at least 2012. So does the pressure to more actively regulate speech on the platform.

That year, fake reports of sectarian violence went viral in India, setting off riots that killed several people and displaced thousands. Indian officials put so much pressure on Facebook to remove the posts that US officials publicly intervened in the company’s defence.

Reports of Facebook-linked violence only grew in India, and as Facebook expanded to other developing countries, similar stories followed.

“I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen,” Chamath Palihapitiya, a senior executive who left Facebook in 2011, said at a policy conference last year. “We have created tools that are ripping apart the social fabric of how society works.”

There were other warnings, typically from activists or civil society leaders in the developing countries where Facebook’s expansion was fastest and most obviously disruptive. But they were little heeded.

“Facebook is the platform that we could not meet with for years,” Damar Juniarto, who leads an Indonesian organisation that tracks online hate groups, told me in March.

As a Facebook-based group called the Muslim Cyber Army organised increasingly elaborate real-world attacks, Mr Juniarto said, Facebook proved unresponsive. “How are we supposed to do this?” members of his group wondered. “Is it a form? Do we email them? We want them to tell us.”

Facebook representatives eventually met with Mr Juniarto, and the company has since shut down most pages associated with the Muslim Cyber Army.

Still, the episode seems to fit a pattern of Facebook waiting to respond until after a major disruption: an organised lynching, a sectarian riot, state-sponsored election meddling or, as with the so-called Pizzagate rumour pushed by Jones, a violent close call set off by misinformation.

A CORPORATE REGULATOR OF PUBLIC LIFE

In the developing countries where such incidents seem most common, or at least most explicitly violent, Facebook simply faces little pressure to act.

In Sri Lanka, government officials spoke of the company as if it were a superpower to be feared and appeased.

Tellingly, Facebook grew more proactive in Myanmar only after the United Nations and Western organisations accused it of having played a role in spreading the hate and misinformation that contributed to acts of ethnic cleansing.

Even officials in India, a major power, struggled to get the company to listen. Indian pressure on Facebook, however, has dropped since the arrival of new government leaders who rose, in part, on a Hindu nationalist wave still prevalent on social media.

US officials have far greater leverage over Facebook, as lawmakers proved when they summoned Mark Zuckerberg, its chief executive officer, to testify in April. But the Americans seem unsure what they want Facebook to do or how to compel it to act. So they, too, are not very effective at changing the company’s behaviour.

More broadly, Americans seem unsure precisely how far Facebook should go in regulating speech on the platform, or what it should do about the data suggesting that misinformation is more common on the political right.

All of which comes through in Facebook’s hesitation about shutting down Jones’ page, despite his long record of demonstrable falsehoods that have real-world consequences.

MOVE FAST AND BREAK THINGS

There are growing indications that Facebook’s problems in rich countries may go beyond misinformation to the kind of harm developing countries have experienced.

Karolin Schwarz, who runs a Berlin-based organisation that tracks social media misinformation, said she believed Facebook-based rumours about refugees could be fuelling the spate of hate crimes against them.

“I think it does something to their sense of community,” she said. “These things, if they reach thousands of people, you cannot get it back.”

The platform has grown so powerful, so quickly, that we are still struggling to understand its influence. Social scientists regularly discover new ways that Facebook alters the societies where it operates: a link to hate crimes, a rise in extremism, a distortion of social norms.

After all, Jones, for all his demagogic skills, was tapping into misinformation and paranoia already on the platform.

Alex Jones, the conservative host of Infowars.com, is seen in his Austin control room in February last year. In 2018, Apple, Facebook, YouTube and Spotify removed large portions of content posted by Jones, in what is seen as a major step to curb one of the most prominent online voices trafficking in what they deem misinformation. (Photo: NYT)
