The Guardian Australia

Facebook admits failings over incitement to violence in Myanmar

- Hannah Ellis-Petersen

Facebook has admitted it did not do enough to prevent the incitement of violence and hate speech in Myanmar, after a report it commissioned concluded that it had become a platform for harmful and racially inflammatory content.

The report by San Francisco-based nonprofit Business for Social Responsibility (BSR) found that, in Myanmar, “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence.”

There are now 20 million Facebook users in Myanmar, and many rely on the platform as their main source of news in the absence of a free media.

But the report concluded that Facebook was being used by “bad actors” to spread hate speech, incite violence and coordinate harm in Myanmar, echoing findings by civil society and tech groups, some of whom have been highlighting the issue to the social media giant for more than four years.

A large proportion of this hate speech has been directed towards the Rohingya, the Muslim minority in Myanmar.

In April, the Guardian reported that hate speech on Facebook in Myanmar had exploded during the Rohingya crisis, which was sparked by a military crackdown in Rakhine state in August 2017. Tens of thousands of Rohingya were killed, raped and assaulted, villages were razed to the ground and more than 700,000 Rohingya fled over the border to Bangladesh.

The recent UN fact-finding mission to Myanmar, which concluded a genocide had taken place against the Rohingya in Rakhine, specifically singled out the role of Facebook in fanning the flames of anti-Muslim sentiment and violence.

Alex Warofka, a Facebook product policy manager, said in a blog post that the report demonstrated that “prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”

The company said it was tackling the problem, this year hiring 100 native Myanmar speakers to review content. It took action on around 64,000 pieces of content in Myanmar for violating hate speech policies in 2018, and also took down 18 accounts and 52 pages associated with figures in the Myanmar military who were named in the UN fact-finding report as being involved in the genocide and ethnic cleansing in Rakhine.

However, the BSR report made it clear that due to the “complex social and political context of Myanmar” the social media giant did not yet have the problem under control and there was still a “high likelihood” of hate speech being posted on Facebook in Myanmar.

The report said the consequences for the victims of this hate speech on Facebook were “severe, with lives and bodily integrity placed at risk from incitement to violence.”

One interviewee quoted in the report said: “Activists are being harassed, self-censorship exists, and activity on Facebook today is closing freedom of expression, rather than increasing it. One side is shutting down the other, and it is no longer a marketplace of ideas.”

In particular, the report highlighted the upcoming 2020 general elections in Myanmar as a cause for concern.

“Today’s challenging circumstances are likely to escalate in the run-up to the election, and Facebook would be well-served by preparing for multiple eventualities now,” the report’s authors warned.

The report recommended Facebook enforce its policies more strictly. Photograph: Danish Siddiqui/Reuters
