Hatebook Myanmar

In a nation where ethnic and religious hatreds have long simmered and occasionally boiled over, social media represents a brave new digital frontier where news battles with hearsay, gossip and threats that can have deadly consequences

Southeast Asia Globe

Meet the activists fighting against the spread of hate speech through social media

Hate speech isn’t a new phenomenon in Myanmar. But it has intensified with the advent of social media, smartphones and affordable SIM cards. “I’ve been called a kalar-lover [and] had my address, my ID card, ethnicity, native city, religion and address leaked,” said youth activist Khin Sandar, describing her experiences on the receiving end of hate speech on Facebook. She’s had to deal with the panic of reporting daily posts of abuse on the site that insult her religion, disclose her personal identity and threaten her safety.

Memes calling Muslims rats or dogs, or using the word kalar – a common racist slur against Muslims and Indians – are just some of the examples of hate speech that continue to be shared on Facebook, despite the scrutiny of activists and promises by the platform. Personal attacks like those Khin Sandar experiences are still slipping by Facebook’s monitoring team, several sources have confirmed to Southeast Asia Globe.

Facebook defines hate speech in its Community Standards as “a direct attack on people based on protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease.” But, while its standards are clear, its detection of breaches falls short, say critics.

The other major problem is that social media is a new information delivery technology with a steep learning curve. In Myanmar, Facebook is so dominant that it’s seen as the internet itself, said one activist. Most Myanmar people use Facebook as a search browser and their primary news source. Internet literacy is extremely low in the country, as a new generation leapfrogs from a time under the military regime, when only the elite could afford a SIM card costing an astounding $1,500, to smartphone users who can now buy SIMs for just $2 each.

Myanmar is not alone. As one Sri Lankan analyst told the Guardian, while Sri Lanka enjoys high literacy, it suffers from a lack of information literacy: “It means the population can read and write but tends to immediately believe and uncritically respond to that which they see on social media.”

At a peace march in Yangon in May, around 300 activists held signs with doves on them as they called for an end to all military skirmishes in the country between government forces and ethnic minorities like the Kachin in northern Myanmar. The demonstration was shut down after some civilian Buddhist nationalists showed up and started attacking protesters as the police looked on. When police finally decided to intervene, it was to arrest the protesters who were under attack.

Khin Sandar was one of those activists. She pulled up a screenshot of a Facebook post showing a photo of her being arrested by police at the protest. The Burmese comment that went with it translated roughly to: “These people earn dollars and are creating unrest by associating with rebels and kalar.”

The post was shared again and again, often with comments calling for violence and even death threats against Khin Sandar.

Khin Sandar said she reported the post to Facebook as hate speech. Twenty-four hours after the protest, she checked her phone to find that the post was still up – and was clocking hundreds of shares. She was scared, so she alerted family and friends.

“I was at risk and my friends were trying to help me,” she said. “Friends were reporting the post through their accounts and others were trying to email Facebook administrators directly to alert them to this serious case.”

It took 48 hours for Facebook to review, respond and delete the post. By then, it had been shared nearly 1,900 times. She felt helpless.

Critics say Facebook’s slow response to hate speech increases the danger to those at whom it is directed. Although Facebook CEO Mark Zuckerberg pledged a 24-hour review and removal of hate speech at the landmark US Senate inquiry in April, Myanmar activists say the average response time is closer to 48 hours.

“The problem of hate speech has not gone away,” said Htaike Htaike of the Yangon-based Myanmar ICT for Development Organisation (Mido), a group that teaches internet literacy and monitors hate speech. Mido launched its Safe Online Space (SOS) curriculum in 2016 to teach people the basics of social media and the internet, how to report hate speech on Facebook and how to distinguish hearsay from actual news. Through its “training of trainers”, Mido is developing a network of people to pass on this knowledge across Myanmar.

Htaike brought out her laptop and showed the data Myanmar activists have been collecting on serious hate-speech posts that have been reported to Facebook. Next to each post was a column showing how many hours it took for it to be removed. The most common wait time was 48 hours. But Htaike said these few posts represented a tiny fraction of the thousands of hate-filled posts Mido has tracked.

This slow response time has not improved since March, when the UN called the social media giant a “beast” for its alleged role in fuelling the violence against the Rohingya – in part by allowing hate-speech posts to be shared online.

These Facebook posts included lists of well-known activists and Muslim community leaders who were vocal on the Rohingya issue, with links to their Facebook accounts. Some posts called for activists to be assassinated. One post depicted the purported corpse of a Rakhine Buddhist woman with ripped clothes lying in the grass, with the suggestion that she had been raped and murdered by Muslim men – an old story that led to deadly violence in Rakhine State in 2012.

Two chain messages – one inciting violence against the Muslim community, the other mirroring its language against the Buddhist community – were shared hundreds of thousands of times in the lead-up to 9 September last year. They ordered people to take up arms, warned of jihad and encouraged an anti-kalar movement.

Since the crisis that saw over 700,000 Rohingya flee Rakhine State for Bangladesh – described as ethnic cleansing by the UN – hate speech has only continued to escalate.

“We know we’ve been too slow to respond to the developing situation in Myanmar,” admitted David Caragliano, a Facebook content policy manager, during the company’s first visit to Myanmar, in May.

A team of five Facebook staff visited the country for one week for a whirlwind of meetings with Myanmar-based civil society organisations, activists and the government.

The Facebook rep insisted the company is improving as a hate-speech watchdog. Referring to the same protest in Yangon, Caragliano vaguely suggested they were able to respond to some reports during the protests within 24 hours: “We removed numerous pieces of threatening content towards activists within three hours and a video depicting graphic violence.”

These examples are not enough, said Burmese activists.

“Two out of how many hate-speech posts?” asked Ei Myat Noe Khin of the Yangon-based tech accelerator Phandeeyar, which helped Facebook translate its Burmese-language community standards. Despite her group’s repeated requests, she said Facebook had not shared any information or metrics with Myanmar groups to show how it is monitoring posts – or how many posts it had removed and how quickly.

Herein lies the problem at the heart of the hate: Facebook provides no evidence and no transparency in the form of country-specific reports. So in May, activists from Sri Lanka, Vietnam, India, Syria, Ethiopia and Myanmar formed the Global South coalition to hold Facebook accountable for failing to put adequate protections in place.

Wirathu, Myanmar’s most notorious hardline Buddhist nationalist monk – who has been charged with inciting anti-Muslim riots – finally had his Facebook account suspended in late January this year after he had been repeatedly caught breaching Facebook’s community standards with hate-filled posts.

Yet Htaike Htaike said monitoring by Mido has found that bad actors like Wirathu are still active on the platform, sometimes using fake names. And even if Wirathu and his ilk are locked out, they have plenty of followers to carry their hate torches high. Wirathu’s hardline nationalist group, the Patriotic Association of Myanmar, abbreviated in Burmese as Ma Ba Tha, thrives on Facebook pages and in multiple groups.

Reporting and deleting of posts is not enough, say activists – better detection is sorely needed. Facebook’s Caragliano said the platform is getting tougher in its approach to hate speech by implementing systems to “proactively detect this kind of content”, but Myanmar cybersecurity groups say they are still seeing these blacklisted figures online.

Caragliano admitted it is hard for artificial intelligence to identify hate speech with the same precision with which the platform identifies nudity, terrorist propaganda and spam. In Myanmar, the challenge of building systems that detect hate speech in Burmese text – rendered in both the Zawgyi One and Unicode fonts – is not as simple as taking down posts containing keywords.

When Facebook suddenly banned the slur kalar last year, Htaike Htaike said it was a Band-Aid solution: “When they brought in the blanket ban of the word kalar, it became a joke because it didn’t work.” That ban didn’t establish a process, Htaike explained. It was a shortsighted decision that meant innocuous phrases like kalar pae (lentil beans) were also suddenly taken down. Facebook said its review system now considers context.

Another roadblock is detecting hate speech on closed pages and in private messages.


During a peace demonstration in Yangon in May, protesters were targeted by Facebook posts calling for violence against them

Rohingya Muslims flee Myanmar during 2017’s wave of violence against them

Buddhist nationalists protest outside a Yangon courthouse during a trial of their fellow nationalists on June 2, 2017

Buddhist monks who support Ma Ba Tha attend a celebration in Yangon of the 2014 establishment of four controversial bills decried by rights groups as aimed at discriminating against the country’s Muslim minority
