The Washington Post

An ex-moderator sued Facebook, claiming she has PTSD from her daily review of disturbing content for the social network.

BY ELIZABETH DWOSKIN elizabeth.dwoskin@washpost.com. More at washingtonpost.com/news/technology

SAN FRANCISCO — A former Facebook content moderator is suing the company on the grounds that reviewing disturbing material on a daily basis caused her psychological and physical harm, according to a lawsuit filed Monday in a California superior court.

The suit by former moderator Selena Scola, who worked at Facebook from June 2017 until March, alleges that she witnessed thousands of acts of extreme and graphic violence “from her cubicle in Facebook’s Silicon Valley offices,” where Scola was charged with enforcing Facebook’s extensive rules prohibiting certain types of content on its systems.

Scola, who worked at Facebook through a third-party contracting company, developed post-traumatic stress disorder “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” the suit says.

Facebook didn’t respond to a request for comment.

Facebook relies on thousands of moderators to determine whether posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation. Many objectionable categories come with their own sublists of exceptions. It is staffing up its global workforce — hiring 20,000 content moderators and other safety specialists in places such as Dublin, Austin and the Philippines — in response to allegations that the company has not done enough to combat abuse of its services, including Russian meddling, illegal drug content and fake news.

The social network says that in recent years, it has been developing artificial intelligence to spot problematic posts, but the technology isn’t sophisticated enough to replace the need for significant amounts of human labor.

Facebook is under intense scrutiny from politicians and lawmakers, who have taken top executives to task in two high-profile hearings on Capitol Hill this year and are considering new regulations that would hold companies to a more stringent standard of responsibility for illegal content posted on their platforms.

The complaint also charges the Boca Raton, Fla.-based contracting company Pro Unlimited with violating California workplace safety standards.

Pro Unlimited didn’t respond to a request for comment.

The lawsuit does not go into detail about Scola’s particular experience because she signed a nondisclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she violates it, the suit says. Her attorneys plan to dispute the NDA but are holding off on providing more details until a judge weighs in.

The suit notes that Facebook is one of the leading companies in an industry-wide consortium that has developed workplace safety standards for the moderation field. The complaint alleges that, unlike its industry peers, Facebook does not uphold the standards it helped develop.

In late 2016, two former content moderators sued Microsoft, claiming that they developed PTSD and that the company did not provide adequate psychological support.

Scola’s lawsuit asks that Facebook and its third-party outsourcing companies provide content moderators with proper mandatory on-site and ongoing mental-health treatment and support, and establish a medical monitoring fund for testing and providing mental-health treatment to former and current moderators.

Facebook has been historically tight-lipped about its moderator program. The guidelines used by moderators to make decisions were secret until this year, when the company released a portion of them publicly. The company has declined to disclose information about where moderators work, as well as the hiring practices, performance goals and working conditions for moderators.
