Inside Facebook's secret rule book for global political speech

The Bradenton Herald (Sunday) – Nation & World – By Max Fisher

MENLO PARK, CALIF.

In a glass conference room at its California headquarters, Facebook is taking on the bonfires of hate and misinformation it has helped fuel across the world, one post at a time.

The social network has drawn criticism for undermining democracy and for provoking bloodshed in societies small and large.

But for Facebook, it's also a business problem.

The company, which makes about $5 billion in profit per quarter, has to show it is serious about removing dangerous content. It must also continue to attract more users from more countries and try to keep them on the site longer.

How can Facebook monitor billions of posts per day in over 100 languages, all without disturbing the endless expansion that is core to its business? The company's solution: a network of workers using a maze of PowerPoint slides spelling out what's forbidden.

Every other Tuesday morning, several dozen Facebook employees gather to come up with the rules, hashing out what the site's 2 billion users should be allowed to say. The guidelines that emerge from these meetings are sent to 7,500-plus moderators around the world.

The closely held rules are extensive, and they make the company a far more powerful arbiter of global speech than has been publicly recognized or acknowledged by the company itself, The New York Times has found.

FATEFUL ERRORS

The Times was provided with more than 1,400 pages from the rule books by an employee who said he feared the company was exercising too much power, with too little oversight – and making too many mistakes.

An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.

Moderators were once told, for example, to remove fundraising appeals for volcano victims in Indonesia because a cosponsor of the drive was on Facebook's internal list of banned groups. In Myanmar, a paperwork error allowed an extremist group, accused of fomenting genocide, to stay on the platform for months. In India, moderators were mistakenly told to take down comments critical of religion.

The Facebook employees who meet to set the guidelines, mostly young engineers and lawyers, try to distill highly complex issues into simple yes-or-no rules. Then the company outsources much of the post-by-post moderation to companies that enlist largely unskilled workers, many hired out of call centers.

Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day.

Moderators express frustration at rules they say don't always make sense and sometimes require them to leave up posts they fear could lead to violence. "You feel like you killed someone by not acting," one said, speaking on the condition of anonymity because he had signed a nondisclosure agreement.

Facebook executives say they are working diligently to rid the platform of dangerous posts.

"It's not our place to correct people's speech, but we do want to enforce our community standards on our platform," said Sara Su, a senior engineer on the News Feed. "When you're in our community, we want to make sure that we're balancing freedom of expression and safety."

Monika Bickert, Facebook's head of global policy management, said that the primary goal was to prevent harm, and that to a great extent, the company had been successful. But perfection, she said, is not possible.

SETTING RULES

The guidelines for identifying hate speech, a problem that has bedeviled Facebook, run to 200 pages. Moderators must sort a post into one of three "tiers" of severity. They must bear in mind lists like the six "designated dehumanizing comparisons," among them comparing Jews to rats.

"There's a real tension here between wanting to have nuances to account for every situation, and wanting to have a set of policies we can enforce accurately and we can explain cleanly," said Bickert.

As detailed as the guidelines can be, they are also approximations – best guesses at how to fight extremism or disinformation. And they are leading Facebook to intrude into sensitive political matters the world over, sometimes clumsily.

Increasingly, decisions on what posts should be barred amount to regulating political speech – and not just on the fringes. In many countries, extremism and the mainstream are blurring.

In the U.S., Facebook banned the Proud Boys, a far-right pro-Trump group. The company also blocked an inflammatory ad, about a caravan of Central American migrants, that was produced by President Donald Trump's political team.
