The Prince George Citizen

All that can’t be unseen

Human cost paid by the people who shield you from the worst of the web


MANILA, Philippines – A year after quitting his job reviewing some of the most gruesome content the internet has to offer, Lester prays every week that the images he saw can be erased from his mind.

First as a contractor for YouTube and then for Twitter, he worked on an upper floor of a mall in this traffic-clogged Asian capital, where he spent up to nine hours each day weighing questions about the details in those images. He made decisions about whether a child’s genitals were being touched accidentally or on purpose, or whether a knife slashing someone’s neck depicted a real-life killing – and whether such content should be allowed online.

He’s still haunted by what he saw. Today, entering a tall building triggers flashbacks to the suicides he reviewed, causing him to entertain the possibility of jumping. At night, he Googles footage of bestiality and incest – material he was never exposed to before but is now ashamed to be drawn to. For the last year, he has visited a mall chapel every week, where he works with a church brother to ask God to “white out” those images from his memory.

“I know it’s not normal, but now everything is normalized,” said the 33-year-old, using only his first name because of a confidentiality agreement he signed when he took the job.

Workers such as Lester are on the front lines of the never-ending battle to keep the internet safe. But thousands of miles separate the Philippines and Silicon Valley, rendering these workers vulnerable to exploitation by some of the world’s tech giants.

In the last couple of years, social media companies have created tens of thousands of jobs around the world to vet and delete violent or offensive content, attempting to shore up their reputations after failing to adequately police content including live-streamed terrorist attacks and Russian disinformation spread during the U.S. presidential election. Yet the firms keep these workers at arm’s length, creating separation by employing them as contractors through giant outsourcing agencies.

Workers here say the companies do not provide adequate support to address the psychological consequences of the work. They said that they cannot confide in friends because the confidentiality agreements they signed prevent them from doing so, that it is tough to opt out of content that they see, and that daily accuracy targets create pressure not to take breaks. The tech industry has acknowledged the importance of allowing content moderators these freedoms – in 2015 signing on to a voluntary agreement to provide such options for workers who view child exploitation content, which most workers said they were exposed to.

The vulnerability of content moderators is most acute in the Philippines, one of the biggest and fastest-growing hubs of such work and an outgrowth of the country’s decades-old call centre industry. Unlike moderators in other major hubs, such as those in India or the United States, who mostly screen content that is shared by people in those countries, workers in offices around Manila evaluate images, videos and posts from all over the world. The work places enormous burdens on them to understand foreign cultures and to moderate content in up to 10 languages that they don’t speak, while making several hundred decisions a day about what can remain online.

In interviews with The Washington Post, 14 current and former moderators in Manila described a workplace where nightmares, paranoia and obsessive ruminations were common consequences of the job. Several described seeing colleagues suffer mental breakdowns at their desks. One of them said he attempted suicide as a result of the trauma.

Several moderators call themselves silent heroes of the internet, protecting Americans from the ills of their own society, and say they’ve become so consumed by the responsibility of keeping the web safe that they look for harmful content in their free time to report it.

“At the end of a shift, my mind is so exhausted that I can’t even think,” said a Twitter moderator in Manila. He said he occasionally dreamed about being the victim of a suicide bombing or a car accident, his brain recycling images that he reviewed during his shift. “To do this job, you have to be a strong person and know yourself very well.”

The moderators worked for Facebook, Facebook-owned Instagram, Google-owned YouTube, Twitter and the Twitter-owned video-streaming platform Periscope, as well as other such apps, all through intermediaries such as Accenture and Cognizant. Each spoke on the condition of anonymity or agreed to the use of their first name only because of the confidentiality agreements they were required by their employers and the tech companies to sign.

In interviews, tech company officials said they had created many of the new jobs in a hurry, and acknowledged that they were still grappling with how to offer appropriate psychological care and improve workplace conditions for moderators, while managing society’s outsize expectations that they quickly remove undesirable content. The companies pointed to a series of changes they’ve made over the past year or so to address the harms. A Facebook counselor acknowledged that a form of PTSD known as vicarious trauma could be a consequence of the work, and company training addresses the potential for suicidal thinking. Officials acknowledged there were still disconnects between policies created in Silicon Valley and the implementation of those policies at moderation sites run by third parties around the world.

While American workers view the job as a steppingstone to a potential career in the tech industry, Filipino workers view the dead-end job as one of the best they can get. They are also far more fearful of the consequences of breaking their confidentiality agreements.

The jobs have strong attractions for many recent college graduates here. They offer relatively good pay and provide a rare ticket to the middle class. And many workers say they walk away without any negative effects.

Though tens of thousands of people around the world spend their days reviewing horrific content, there has been little formal study of the impact of content moderators’ routine exposure to such imagery. While many Filipino workers say they became content moderators because they thought it would be easier than customer service, the work is very different from other jobs because of “the stress of hours and hours of exposure to people getting hurt, to animals getting hurt, and all sorts of hurtful violent imagery,” said Sylvia Estrada-Claudio, dean of the College of Social Work and Community Development at the University of the Philippines, who has counseled workers in the call-centre industry. She said she worries about a generation of young people exposed to such disturbing material. “For people with underlying issues, it can set off a psychological crisis.”

Lester said he had no control over what post he was going to see – whether the feed would show him an Islamic State murder or a child being forced into sex with an animal or an anti-Trump screed. He had no ability to blur or minimize the images, which are about the size of a postcard, or to toggle to a different screen for a mental breather, because the computer was not connected to the internet.

“Too many to count,” he said, when asked to estimate the number of violent images he saw over the course of eight months of reviewing Twitter and YouTube content. One unforgettable video, of what he assumed was a gang-related murder in Africa, showed a group of men dragging a man into a forest and repeatedly slashing his throat with a large knife until blood covered the camera lens. Lester estimates that he reviewed roughly 10 murders a month. He reviewed at least 1,000 pieces of content related to suicide, he said, which included mostly photos and written posts of people crying for help or announcing their plans to kill themselves.

After a while, Lester noticed that his work was beginning to take a toll on his well-being. It was reigniting feelings of depression that he had struggled with in his 20s, and he started to think about suicide again.

He said he could not afford mental health treatment and didn’t seek it.

At the same time, he was ashamed and disturbed to discover that some of the new sexual imagery to which he was being exposed aroused him.

Lester, who now works in a call centre selling life insurance, says he has been pushing his former colleagues to quit moderation.

Content moderation, he said, should be left to the robots. “People think, because we’re Filipinos, we are happy people. We can adapt,” he said. “But this stays in our heads forever... They should turn these jobs into machines.”

WASHINGTON POST PHOTO Lester, whose last name isn’t being used, sits on his bed, jotting down notes. His work as a content moderator in the Philippines has caused him emotional distress.

WASHINGTON POST PHOTO The grandmother of Lester, whose last name isn’t being used, watches him as he prepares his breakfast before leaving for work.
