In a nation where ethnic and religious hatreds have long simmered and occasionally boiled over, social media represents a brave new digital frontier where news battles with hearsay, gossip and threats that can have deadly consequences.
Meet the activists fighting against the spread of hate speech through social media
Hate speech isn’t a new phenomenon in Myanmar. But it has intensified with the advent of social media, smartphones and affordable SIM cards. “I’ve been called a kalar-lover [and] had my address, my ID card, ethnicity, native city, religion and address leaked,” said youth activist Khin Sandar, describing her experiences on the receiving end of hate speech on Facebook. She’s had to deal with the panic of reporting daily posts of abuse on the site that insult her religion, disclose her personal identity and threaten her safety.
Memes calling Muslims rats or dogs, or using the word kalar – a common racist slur against Muslims and Indians – are just some of the examples of hate speech that continue to be shared on Facebook, despite the scrutiny of activists and promises by the platform. Personal attacks like those Khin Sandar experiences are still slipping by Facebook’s monitoring team, several sources have confirmed to Southeast Asia Globe.
Facebook defines hate speech in its Community Standards as “a direct attack on people based on protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease.” But, while its standards are clear, its detection of breaches falls short, say critics.
The other major problem is that social media is a new information-delivery technology with a steep learning curve. In Myanmar, Facebook is so dominant that it’s seen as the internet itself, said one activist. Most Myanmar people use Facebook as a search browser and their primary news source. Internet literacy is extremely low in the country, which has leapfrogged from a time under the military regime when only the elite could afford a SIM card costing up to an astounding $1,500 to a new generation of smartphone users who can now buy SIMs for just $2 each.
Myanmar is not alone. As one Sri Lankan analyst told the Guardian, while Sri Lanka enjoys high literacy, it suffers from a lack of information literacy: “It means the population can read and write but tends to immediately believe and uncritically respond to that which they see on social media.”
At a peace march in Yangon in May, around 300 activists held signs with doves on them as they called for an end to all military skirmishes in the country between government forces and ethnic minorities like the Kachin in northern Myanmar. The demonstration was shut down after some civilian Buddhist nationalists showed up and started attacking protesters as the police looked on. When police finally decided to intervene, it was to arrest the protesters who were under attack.
Khin Sandar was one of those activists. She pulled up a screenshot of a Facebook post showing a photo of her being arrested by police at the protest. The Burmese comment that went with it translated roughly to: “These people earn dollars and are creating unrest by associating with rebels and kalar.”
The post was shared again and again, often with comments calling for violence and even death threats against Khin Sandar.
Khin Sandar said she reported the post to Facebook as hate speech. Twenty-four hours after the protest, she checked her phone to find that the post was still up – and was clocking hundreds of shares. She was scared, so she alerted family and friends.
“I was at risk and my friends were trying to help me,” she said. “Friends were reporting the post through their accounts and others were trying to email Facebook administrators directly to alert them to this serious case.”
It took 48 hours for Facebook to review, respond and delete the post. By then, it had been shared nearly 1,900 times. She felt helpless.
Critics say Facebook’s slow response to hate speech increases the danger to those at whom it’s directed. Although Facebook CEO Mark Zuckerberg pledged at the landmark US Senate inquiry in April that hate speech would be reviewed and removed within 24 hours, Myanmar activists say the average response time is closer to 48 hours.
“The problem of hate speech has not gone away,” said Htaike Htaike of the
Yangon-based Myanmar ICT for Development Organisation (Mido), a group that teaches internet literacy and monitors hate speech. Mido launched its Safe Online Space (SOS) curriculum in 2016 to teach people the basics of social media and the internet, how to report hate speech on Facebook and how to identify hearsay versus actual news. Through its “training of trainers”, Mido is developing a network of people to pass on this knowledge across Myanmar.
Htaike brought out her laptop and showed the data Myanmar activists have been collecting of serious hate speech posts that have been reported to Facebook. Next to each post was a column showing how many hours it took for it to be removed. The most common wait time was 48 hours. But Htaike said these few posts represented a tiny fraction of the thousands of hate-filled posts Mido has tracked.
This slow response time has not improved since March, when the UN called the social media giant a “beast” for its alleged role in fuelling the violence against the Rohingya – in part by allowing hate-speech posts to be shared online.
These Facebook posts included lists of well-known activists and Muslim community leaders who were vocal on the Rohingya issue, with links to their Facebook accounts. Some posts called for activists to be assassinated. One post depicted the purported corpse of a Rakhine Buddhist woman with ripped clothes lying in the grass, with the suggestion that she had been raped and murdered by Muslim men – an old story that led to deadly violence in Rakhine State in 2012.
Two chain messages – one inciting violence against the Muslim community, the other mirroring the same threats against the Buddhist community – were shared hundreds of thousands of times in the lead-up to 9 September last year. They ordered people to take up arms, warned of jihad and encouraged an anti-kalar movement.
Since the crisis that drove more than 700,000 Rohingya from Rakhine State into Bangladesh – described as ethnic cleansing by the UN – hate speech has only continued to escalate.
“We know we’ve been too slow to respond to the developing situation in Myanmar,” admitted David Caragliano, a Facebook content policy manager, during the company’s first visit to Myanmar, in May.
A team of five Facebook staff visited the country for one week for a whirlwind of meetings with Myanmar-based civil society organisations, activists and the government.
The Facebook rep insisted the company is improving as a hate-speech watchdog. Referring to the same protest in Yangon, Caragliano vaguely suggested they were able to respond to some reports during the protests within 24 hours: “We removed numerous pieces of threatening content towards activists within three hours and a video depicting graphic violence.”
These examples are not enough, said Burmese activists.
“Two out of how many hate-speech posts?” asked Ei Myat Noe Khin of the Yangon-based tech accelerator Phandeeyar, which helped Facebook translate its Burmese-language community standards. Despite her group’s repeated requests, she said Facebook had not shared any information or metrics with Myanmar groups to show how it is monitoring posts – or how many posts it had removed and how quickly.
Herein lies the problem at the heart of the hate: Facebook offers no evidence of, and no transparency around, its country-specific enforcement. So in May, activists from Sri Lanka, Vietnam, India, Syria, Ethiopia and Myanmar formed the Global South coalition to hold Facebook accountable for failing to put adequate protections in place.
Wirathu, Myanmar’s most notorious hardline Buddhist nationalist monk – who has been charged with inciting anti-Muslim riots – finally had his Facebook account suspended in late January this year after he had been repeatedly caught breaching Facebook’s community standards with hate-filled posts.
Yet Htaike Htaike said monitoring by Mido has found that bad actors like Wirathu are still active on the platform, sometimes using fake names. And even if Wirathu and his ilk are locked out, they have plenty of followers to carry their hate torches high. Wirathu’s hardline nationalist group, the Patriotic Association of Myanmar, abbreviated in Burmese as Ma Ba Tha, thrives on Facebook pages and in multiple groups.
Reporting and deleting of posts is not enough, say activists – better detection is sorely needed. Facebook’s Caragliano said the platform is getting tougher in its approach to hate speech by implementing systems to “proactively detect this kind of content”, but Myanmar cybersecurity groups say they are still seeing these blacklisted figures online.
Caragliano admitted it is hard for artificial intelligence to identify hate speech with the same precision with which the platform identifies nudity, terrorist propaganda and spam. In Myanmar, building systems that detect hate speech in Burmese text – which circulates in both the Zawgyi-One and Unicode encodings – isn’t as simple as taking down posts containing keywords.
When Facebook suddenly banned the slur kalar last year, Htaike Htaike said it was a Band-Aid solution: “When they brought in the blanket ban of the word kalar, it became a joke because it didn’t work.” The ban didn’t establish a process, Htaike explained. It was a shortsighted decision that meant phrases like kalar page (lentil beans) were also suddenly taken down. Facebook said its review system now considers context.
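The pitfall Htaike Htaike describes is easy to demonstrate. A blanket keyword ban amounts to crude substring matching, which cannot distinguish a slur used as an attack from the same letters inside an innocuous phrase. The snippet below is a hypothetical sketch of that failure mode, not a representation of Facebook’s actual moderation system:

```python
# Hypothetical sketch: why a blanket keyword ban misfires.
# The word list and phrases are illustrative only.
BANNED_KEYWORDS = {"kalar"}

def naive_ban(post: str) -> bool:
    """Flag a post if any banned keyword appears anywhere in its text."""
    text = post.lower()
    return any(keyword in text for keyword in BANNED_KEYWORDS)

# A slur used as an attack is caught...
assert naive_ban("Go home, kalar") is True
# ...but so is a harmless food reference containing the same letters,
# the kind of false positive the blanket ban produced.
assert naive_ban("My recipe for kalar page (lentil bean) curry") is True
# Meanwhile, hate expressed without the keyword sails through.
assert naive_ban("They should all be driven out") is False
```

Context-aware review, which Facebook says it now applies, requires looking at who is being targeted and how a term is used, not merely whether a string is present.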
Another roadblock is detecting hate speech on closed pages and in private messages. One improvement Facebook has
Photo: During a peace demonstration in Yangon in May, protesters were targeted by Facebook posts calling for violence against them
Photo: Rohingya Muslims flee Myanmar during 2017’s wave of violence against them
Photo: Buddhist nationalists protest outside a Yangon courthouse during a trial of their fellow nationalists on 2 June 2017
Photo: Buddhist monks who support Ma Ba Tha attend a celebration in Yangon of the 2014 establishment of four controversial bills decried by rights groups as aimed at discriminating against the country’s Muslim minority