South Florida Sun-Sentinel Palm Beach (Sunday)

Hate still has a home online

Files show Facebook fails to curb hate speech, threats of ethnic and religious violence in Myanmar

By Sam McNeil and Victoria Milko

JAKARTA, Indonesia — Years after coming under scrutiny for contributing to ethnic and religious violence in Myanmar, Facebook still has problems detecting and moderating hate speech and misinformation on its platform in the Southeast Asian nation, internal documents viewed by The Associated Press show.

Three years ago, the company commissioned a report that found Facebook was used to “foment division and incite offline violence” in the country. It pledged to do better and developed several tools and policies to deal with hate speech.

But the breaches have persisted — and even been exploited by hostile actors — since the Feb. 1 military takeover that resulted in gruesome human rights abuses across the country.

Scrolling through Facebook today, it’s not hard to find posts threatening murder and rape in Myanmar.

One 2½-minute video of a military supporter calling for violence against opposition groups, posted Oct. 24, has garnered more than 56,000 views.

“So starting from now, we are the god of death for all (of them),” the man says in Burmese.

One account posts the home address of a military defector and a photo of his wife. Another post from Oct. 29 includes a photo of soldiers leading bound and blindfolded men down a dirt path. The Burmese caption reads, “Don’t catch them alive.”

Despite the ongoing issues, Facebook saw its operations in Myanmar as both a model to export around the world and an evolving and caustic case. Documents show Myanmar became a testing ground for new content moderation technology, with the social media giant trying out ways to automate the detection of hate speech and misinformation with varying levels of success.

Facebook’s internal discussions on Myanmar were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

Facebook has had a shorter but more volatile history in Myanmar than in most countries. After decades of censorship under military rule, Myanmar was connected to the internet in 2000. Shortly afterward, Facebook paired with telecom providers in the country, allowing customers to use the platform without needing to pay for the data, which was still expensive at the time. Use of the platform exploded. For many in Myanmar, Facebook became the internet itself.

A ‘hotbed’

Htaike Htaike Aung, a Myanmar internet policy advocate, said it also became “a hotbed for extremism” around 2013, coinciding with religious riots across Myanmar between Buddhists and Muslims. It’s unclear how much content moderation, if any, was happening at the time, whether by people or by automation.

Htaike Htaike Aung said she met with Facebook that year and laid out issues, including how local organizations were seeing exponential amounts of hate speech on the platform and how its preventive mechanisms, such as reporting posts, didn’t work in the Myanmar context.

One example she cited was a photo of a pile of bamboo sticks that was posted with a caption reading, “Let us be prepared because there’s going to be a riot that is going to happen within the Muslim community.”

Htaike Htaike Aung said the photo was reported to Facebook, but the company said it didn’t violate community standards.

“Which is ridiculous because it was actually calling for violence. But Facebook didn’t see it that way,” she said.

Years later, the lack of moderation caught the attention of the international community. In March 2018, United Nations human rights experts investigating attacks against Myanmar’s Muslim Rohingya minority said Facebook had played a role in spreading hate speech.

When asked about Myanmar a month later during a U.S. Senate hearing, CEO Mark Zuckerberg replied that Facebook planned to hire “dozens” of Burmese speakers to moderate content, work with civil society groups to identify hate figures and develop new technologies to combat hate speech.

“Hate speech is very language specific. It’s hard to do it without people who speak the local language and we need to ramp up our effort there dramatically,” Zuckerberg said.

Information in internal Facebook documents shows that while the company did step up efforts to combat hate speech in the country, the tools and strategies to do so never came to full fruition, and individuals within the company repeatedly sounded the alarm. In one document from May 2020, an employee said a hate speech text classifier that was available wasn’t being used or maintained. Another document from a month later said there were “significant gaps” in misinformation detection in Myanmar.

“Facebook took symbolic actions I think were designed to mollify policymakers that something was being done and didn’t need to look much deeper,” said Ronan Lee, a visiting scholar at Queen Mary University of London’s International State Crime Initiative.

In an emailed statement, Rafael Frankel, Facebook’s director of policy for APAC Emerging Countries, said the platform “has built a dedicated team of over 100 Burmese speakers.” He declined to state exactly how many were employed. Online marketing company NapoleonCat estimates there are about 28.7 million Facebook users in Myanmar.

During her testimony to the European Parliament on Nov. 8, Haugen criticized Facebook for a lack of investment in third-party fact-checking and for relying instead on automatic systems to detect harmful content.

“If you focus on these automatic systems, they will not work for the most ethnically diverse places in the world, with linguistically diverse places in the world, which are often the most fragile,” she said while referring to Myanmar.

After Zuckerberg’s 2018 congressional testimony, Facebook developed digital tools to combat hate speech and misinformation and also created a new internal framework to manage crises like Myanmar around the world.

Lists and tiers

Facebook crafted a list of “at-risk countries” with ranked tiers for a “critical countries team” to focus its energy on, and also rated languages needing more content moderation. Myanmar was listed as a “Tier 1” at-risk country, with Burmese deemed a “priority language” alongside Ethiopian languages, Bengali, Arabic and Urdu.

Facebook engineers taught Burmese slang words for “Muslims” and “Rohingya” to its automated systems. They also trained systems to detect “coordinated inauthentic behavior,” such as a single person posting from multiple accounts, or coordination between different accounts to post the same content.

The company also tried “repeat offender demotion,” which lessened the impact of posts from users who frequently violated guidelines. In a test in two of the world’s most volatile countries, demotion worked well in Ethiopia but poorly in Myanmar, a difference that flummoxed engineers, according to a 2020 report included in the documents.

“We aren’t sure why ... but this information provides a starting point for further analysis and user research,” the report said. Facebook declined to comment on the record on whether the problem had been fixed in the year since its detection, or on the success of the two tools in Myanmar.

The company also deployed a new tool called “reshare depth promotion” to reduce the virality of content by boosting content shared by direct contacts, according to an internal 2020 report. The method is “content-agnostic” and cut viral inflammatory prevalence by 25% and photo misinformation by 48.5%, it said.

Slur detection and demotion were judged effective enough that staffers shared the experience in Myanmar as part of a “playbook” for acting in other at-risk countries such as Ethiopia, Syria, Yemen, Pakistan, India, Russia, the Philippines and Egypt.

While these new methods forged in Myanmar’s civil crises were deployed around the world, documents show that by June 2020 Facebook knew that flaws persisted in its Myanmar safety work.

“We found significant gaps in our coverage (especially in Myanmar and Ethiopia), showcasing that our current signals may be inadequate,” said an internal audit of the company’s “integrity coverage.” Myanmar was color-coded red, with less than 55% coverage: worse than Syria but better than Ethiopia.

Haugen criticized the company’s internal policy of acting “only once a crisis has begun.”

Facebook “slows the platform down instead of watching as the temperature gets hotter, and making the platform safer as that happens,” she told Britain’s Parliament during testimony on Oct. 25.

Frankel said Facebook has been proactive.

“Facebook’s approach in Myanmar today is fundamentally different from what it was in 2017, and allegations that we have not invested in safety and security in the country are wrong,” he said.

Yet, a September report by the Myanmar Social Media Insights Project found that posts on Facebook include coordinated targeting of activists, ethnic minorities and journalists — a tactic that has roots in the military’s history. The report also said the military is laundering its propaganda through public pages that claim to be media outlets.

Opposition and pro-military groups have used the encrypted messaging app Telegram to organize two types of propaganda campaigns on Facebook and Twitter, according to an October report shared with the AP by Myanmar Witness, a U.K.-based organization that archives social media posts related to the conflict.

Takedown

Myanmar is a “highly contested information environment,” where users working in concert overload Facebook’s reporting system to take down others’ posts, and also spread coordinated misinformation and hate speech, the report said.

In one example, the coordinated networks took video of butchered bodies, shot in Mexico three years ago by the Sinaloa cartel, and falsely labeled it as evidence of the opposition killing Myanmar soldiers on June 28, said Benjamin Strick, director of investigations for Myanmar Witness.

“There’s a difficulty in catching it for some of these platforms that are so big and perhaps the teams to look for it are so small that it’s very hard to catch water when it’s coming out of a fire hydrant,” he said.

The organization also traced the digital footprint of one soldier at the incineration of 160 homes in the village of Thantlang in October. He posed in body armor on a ledge overlooking burning homes, with a post blaming opposition forces for the destruction in a litany of violent speech.

Facebook “conducted human rights due diligence to understand and address the risks in Myanmar,” banned the military, and used technology to reduce the amount of violating content, Frankel said.

Yet Myanmar digital rights activists and scholars say Facebook could still take steps to improve, including greater openness about its policies for content moderation, demotion and removal, and acknowledging its responsibilities toward the Myanmar people.

“We need to start examining damage that has been done to our communities by platforms like Facebook. They portray that they are a virtual platform, and thus can have lower regulation,” said Lee, the visiting scholar. “The fact is that there are real-world consequences.”

AP: Masked demonstrators on April 4 in Yangon, Myanmar. A military takeover on Feb. 1 has resulted in gruesome human rights abuses across the Southeast Asian nation.
ALEX BRANDON/AP: Frances Haugen, a former Facebook employee-turned-whistleblower, speaks during a Senate hearing on Oct. 5 in Washington.
