An ex-moderator sued Facebook, claiming she has PTSD from her daily review of disturbing content for the social network.

The Washington Post, Front Page. By Elizabeth Dwoskin ([email protected]). More at washingtonpost.com/news/technology.

SAN FRANCISCO — A former Facebook content moderator is suing the company on the grounds that reviewing disturbing material on a daily basis caused her psychological and physical harm, according to a lawsuit filed Monday in a California superior court.

The suit by former moderator Selena Scola, who worked at Facebook from June 2017 until March, alleges that she witnessed thousands of acts of extreme and graphic violence “from her cubicle in Facebook’s Silicon Valley offices,” where Scola was charged with enforcing Facebook’s extensive rules prohibiting certain types of content on its systems.

Scola, who worked at Facebook through a third-party contracting company, developed post-traumatic stress disorder “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” the suit says.

Facebook didn’t respond to a request for comment.

Facebook relies on thousands of moderators to determine whether posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation. Many objectionable categories come with their own sublists of exceptions. The company is staffing up its global workforce — hiring 20,000 content moderators and other safety specialists in places such as Dublin, Austin and the Philippines — in response to allegations that it has not done enough to combat abuse of its services, including Russian meddling, illegal drug content and fake news.

The social network says that in recent years, it has been developing artificial intelligence to spot problematic posts, but the technology isn’t sophisticated enough to replace the need for significant amounts of human labor.

Facebook is under intense scrutiny from politicians and lawmakers, who have taken top executives to task in two high-profile hearings on Capitol Hill this year and are considering new regulations that would hold such companies to a more stringent standard of responsibility for illegal content posted on their platforms.

The complaint also charges the Boca Raton, Fla.-based contracting company Pro Unlimited with violating California workplace safety standards.

Pro Unlimited didn’t respond to a request for comment.

The lawsuit does not go into detail about Scola’s particular experience because she signed a nondisclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she violates it, the suit says. Her attorneys plan to dispute the NDA but are holding off on providing more details until a judge weighs in.

The suit notes that Facebook is one of the leading companies in an industry-wide consortium that has developed workplace safety standards for the moderation field. The complaint alleges that, unlike its industry peers, Facebook does not uphold the standards it helped develop.

In late 2016, two former content moderators sued Microsoft, claiming that they developed PTSD and that the company did not provide adequate psychological support.

Scola’s lawsuit asks that Facebook and its third-party outsourcing companies provide content moderators with proper mandatory on-site and ongoing mental-health treatment and support, and that they establish a medical monitoring fund for testing and providing mental-health treatment to former and current moderators.

Facebook has historically been tight-lipped about its moderator program. The guidelines used by moderators to make decisions were secret until this year, when the company released a portion of them publicly. The company has declined to disclose information about where moderators work, as well as the hiring practices, performance goals and working conditions for moderators.
