With Alex Jones, FB’s worst demons abroad begin to come home
To Americans, Facebook’s Alex Jones problem might seem novel, even unprecedented.
When does speech become unsafe? When can it be limited? Should those decisions be up to a private company at all? And if a company shies away from acting, as Facebook did with Mr Jones until Apple moved first, where does that leave the rest of us?
But to activists and officials in much of the developing world, both the problem and Facebook’s muddled solutions will be old news.
Before there was Alex Jones, the American conspiracy theorist, there was Amith Weerasinghe, the Sri Lankan extremist who used Facebook as his personal broadcast station.
Mr Weerasinghe leveraged Facebook’s newsfeed to spread paranoia and hatred of the country’s Muslim minority. He enjoyed near-total freedom on the platform, despite repeated pleas from activists and officials for the company to intervene, right up until his arrest on charges of inciting a riot that killed one Muslim and left many more homeless.
Before there was Mr Weerasinghe, there was Ashin Wirathu, the Myanmar extremist, whose Facebook hoaxes incited riots in 2014. Three years later, Mr Wirathu would contribute to a wave of Facebook-based rumors and hate speech that helped inspire widespread violence against Myanmar’s Rohingya minority. And so on.
“Facebook doesn’t seem to get that they’re the largest news agency in the world,” Harindra Dissanayake, a Sri Lankan official, said a few days after Mr Weerasinghe’s arrest.
The problem, he said, goes beyond a few underregulated extremists. It also involves the algorithm-driven newsfeed that is core to the company’s business model. “They are blind to seeing the real repercussions,” Mr Dissanayake said of Facebook’s leaders.
Developing countries’ experiences with Facebook suggest that the company, however noble its intent, has set in motion a series of problems we are only beginning to understand and that the company has proved unable or unwilling to fully address:
Reality-distorting misinformation that can run rampant on the newsfeed, which promotes content that will reliably engage users.
Extremism and hate speech that tap into users’ darkest impulses, and polarize politics.
Malicious actors granted near-limitless reach on one of the most sophisticated communications platforms in history, relatively unchecked by social norms or traditional gatekeepers.
And a private company uneager to wade into contentious debates, much less pick winners and losers.
Facebook — and many Westerners — have long treated those issues as safely “over there,” meaning in countries with weaker institutions, lower literacy rates and more recent histories of racial violence. Last month, a company official, announcing new policies to restrict speech that leads to violence, referred to “a type of misinformation that is shared in certain countries.”
But chillingly similar Facebook-linked problems are becoming increasingly visible in wealthy, developed countries like the United States. So is the difficulty of solving those problems — and the consequences of Facebook’s preference for action that can be incremental, reactive and agonisingly slow.