The Denver Post - BUSINESS - By Tracy Jan and Elizabeth Dwoskin

Francie Latour was picking out produce in a suburban Boston grocery store when a white man leaned toward her two young sons and, just loudly enough for the boys to hear, unleashed a profanity-laced racist epithet.

Reeling, Latour, who is black, turned to Facebook to vent, in a post that was explicit about the hateful words hurled at her 8- and 12-year-olds on a Sunday evening in July.

“I couldn’t tolerate just sitting with it and being silent,” Latour said. “I felt like I was going to jump out of my skin, like my kids’ innocence was stolen in the blink of an eye.”

But within 20 minutes, Facebook deleted her post, sending Latour a cursory message that her content had violated company standards. Only two friends had gotten the chance to voice their disbelief and outrage.

Experiences like Latour’s exemplify the challenges Facebook chief executive Mark Zuckerberg confronts as he tries to rebrand his company as a safe space for community, expanding on its earlier goal of connecting friends and family.

But in making decisions about the limits of free speech, Facebook often fails the racial, religious and sexual minorities Zuckerberg says he wants to protect.

The 13-year-old social network is wrestling with the hardest questions it has ever faced as the de facto arbiter of speech for the third of the world’s population that now logs on each month.

In February, amid mounting concerns over Facebook’s role in the spread of violent live videos and fake news, Zuckerberg said the platform had a responsibility to “mitigate the bad” effects of the service in a more dangerous and divisive political era. In June, he officially changed Facebook’s mission from connecting the world to community-building.

The company says it now deletes about 288,000 hate-speech posts a month.

But activists say that Facebook’s censorship standards are so unclear and biased that it is impossible to know what one can or cannot say.

The result: Minority groups say they are disproportionately censored when they use the social media platform to call out racism or start dialogues. In the case of Latour and her family, she was simply repeating what the man who verbally assaulted her children had said.

Censoring posts

Compounding their pain, Facebook will often go from censoring posts to locking users out of their accounts for 24 hours or more, without explanation – a punishment known among activists as “Facebook jail.”

“In the era of mass incarceration, you come into this digital space – this one space that seems safe – and then you get attacked by the trolls and put in Facebook jail,” said Stacey Patton, a journalism professor at Morgan State University, a historically black university in Baltimore. “It totally contradicts Mr. Zuckerberg’s mission to create a public square.”

In June, the company said that nearly 2 billion people now log onto Facebook each month. With the company’s dramatic growth comes the challenge of maintaining internally consistent standards as its content moderators are faced with a growing number of judgment calls.

“Facebook is regulating more human speech than any government does now or ever has,” said Susan Benesch, director of the Dangerous Speech Project, a nonprofit group that researches the intersection of harmful online content and free speech.

The company has promised to hire 3,000 more content moderators before the year’s end, bringing the total to 7,500, and is looking to improve the software it uses to flag hate speech, a spokeswoman said.

“We know this is a problem,” said Facebook spokeswoman Ruchika Budhraja, adding that the company has been meeting with community activists for several years. “We’re working on evolving not just our policies but our tools. We are listening.”

Two weeks after Donald Trump won the presidency, Zahra Billoo, executive director of the Council on American-Islamic Relations’ office for the San Francisco Bay area, posted to Facebook an image of a handwritten letter mailed to a San Jose mosque and quoted from it: “He’s going to do to you Muslims what Hitler did to the Jews.”

The post – made to four Facebook accounts – contained a notation clarifying that the statement came from hate mail sent to the mosque, as Facebook guidelines advise.

Facebook removed the post from two of the accounts – Billoo’s personal page and the council’s local chapter page – but allowed identical posts to remain on two others – the organization’s national page and Billoo’s public one. The civil rights attorney was baffled. After she re-posted the message on her personal page, it was again removed, and Billoo got a notice saying she would be locked out of Facebook for 24 hours.

“How am I supposed to do my work of challenging hate if I can’t even share information showing that hate?” she said.

Billoo eventually received an automated apology from Facebook, and the post was restored to the local chapter page – but not her personal one.

“Facebook jail”

Being put in “Facebook jail” has become a regular occurrence for Shannon Hall-Bulzone, a San Diego photographer. In June 2016, Hall-Bulzone was shut out for three days after posting an angry screed when she and her toddler were called a racist name as they walked to day care and her sister was called another one as she walked to work. Within hours, Facebook removed the post.

In January, a coalition of more than 70 civil rights groups wrote a letter urging Facebook to fix its “racially biased” content moderation system. The groups asked Facebook to enable an appeals process, offer explanations for why posts are taken down, and publish data on the types of posts that get taken down and restored. Facebook has not done these things.

Like most social media companies in Silicon Valley, Facebook has long resisted being a gatekeeper for speech. For years, Zuckerberg insisted that the social network had only minimal responsibilities for policing content.

In its early years, Facebook’s internal guidelines for moderating and censoring content amounted to only a single page. The instructions included prohibitions on nudity and images of Hitler, according to a trove of documents published by the investigative news outlet ProPublica. (Holocaust denial was allowed.)

By 2015, the internal censorship manual had grown to 15,000 words, according to ProPublica.

In Facebook’s guidelines for moderators, obtained by ProPublica in June and affirmed by the social network, the rules protect broad classes of people but not subgroups. Posts criticizing white or black people would be prohibited, while posts attacking white or black children, or radicalized Muslim suspects, may be allowed to stay up because the company sees “children” and “radicalized Muslims” as subgroups.

The company has acknowledged that minorities feel disproportionately targeted but said it could not verify those claims because it does not categorize the types of hate speech that appear or tally which groups are targeted.

As for Latour, the Boston mother was surprised when Facebook restored her post about the hateful words spewed at her sons, less than 24 hours after it disappeared. The company sent her an automated notice that a member of its team had removed her post in error. There was no further explanation.

The initial censoring of Latour’s experience “felt almost exactly like what happened to my sons writ large,” she said. The man had unleashed the racial slur so quietly that for everyone else in the store, the verbal attack never happened. But it had terrified her boys.

“They were left with all that ugliness and hate,” she said, “and when I tried to share it so that people could see it for what it is, I was shut down.”

Nick Otto, Special to The Washington Post

Zahra Billoo, of the Council on American-Islamic Relations, says she posted a threatening letter received by a San Jose, Calif., mosque on four Facebook accounts. She was baffled when the company removed it from two and left it up on two others.
