Los Angeles Times

Twitter’s child abuse screening team decimated

Sources say fewer than 10 specialists are left to review reports of sexual exploitation.

- By Olivia Solon and Jillian Deutsch. Solon and Deutsch write for Bloomberg. Bloomberg writers Davey Alba, Jack Gillum and Margi Murphy contributed to this report.

Elon Musk has dramatically reduced the size of the Twitter team devoted to tackling child sexual exploitation on the platform, cutting the global team of experts in half and leaving behind an overwhelmed skeleton crew, people familiar with the matter said.

The team now has fewer than 10 specialists to review and escalate reports of child sexual exploitation, three people familiar with the matter said, asking not to be identified for fear of retaliation. At the beginning of the year, Twitter had a team of about 20, they said.

The change comes as lawmakers in the European Union and the U.K. are planning broad-reaching online safety rules that will require social media platforms to better protect children or face significant fines.

Twitter didn’t respond to a request for comment.

The team — a mix of former law enforcement officers and child safety experts based in the U.S., Ireland and Singapore — was stretched before the cuts, working long hours to respond to user reports and legal requests, the people said. They were responsible for stopping the distribution of child sexual abuse material, instances of online grooming, and media that promoted attraction to minors as an identity or sexual orientation.

Last week, Musk tweeted that “removing child exploitation is priority #1” and called on people to “reply in the comments if you see anything that Twitter needs to address.”

Some prominent hashtags associated with child sexual exploitation have been removed since Musk took over, changes that had been in the works before he joined, the people said.

Still, combating this type of messaging isn’t always as simple as removing tweets containing the offending hashtags since many have other, innocuous purposes, they said. Offenders also constantly change the terms they use to evade detection.

Although artificial intelligence-based tools can be useful for identifying images that have already been reviewed and categorized as child sexual exploitation material by law enforcement, human review is particularly important for recognizing the nuances of grooming and other exploitative behaviors, identifying previously unknown abusive images and videos, and understanding regional differences in the law, the people said.

Humans are also required to respond to requests from law enforcement as part of criminal investigations.

Losing specialists in Europe and Singapore will make policing non-English-speaking markets a particular challenge, the people said.

These specialists worked closely with dedicated product managers and engineers to build tools and automation to stop the spread of the material, as well as with third-party contractors who helped triage posts that users reported.

Only a few employees were cut in the first round of layoffs, but the team was decimated when Musk called on Twitter’s workers to commit to a “hardcore” culture or lose their jobs, the people said. Musk didn’t create an environment where the team wanted to stay, the people said.

The defections were part of a broader exodus at Twitter’s trust and safety team, whose members left after Musk sent the ultimatum this month, people familiar with the matter said previously.

The company also lost a significant number of the employees who block foreign disinformation campaigns on the platform, and entire swaths of Twitter’s audience have been left without content moderation, one of the people said. In the Asia-Pacific region, only one contractor hired to help with spam in the Korean market remained, the person said.

Twitter also has cut a number of contractors who helped moderate content, Axios has reported. Social media platforms including Facebook, TikTok and Twitter use third-party moderators to help sift through flagged posts for violations.

Unlike other types of egregious content that violates Twitter’s rules, child sexual abuse material is illegal to host on the platform, and, depending on the country, there are requirements to take down and report material within specific time limits.

In the U.K., the Online Safety Bill gives regulators the power to fine platforms hosting user-generated content as much as 10% of their revenue if they fail to police the content effectively.

The EU is also planning regulation that would require tech companies to take a more aggressive approach to detecting sexual abuse material.

The European Commission’s controversial proposal would give courts the power to require companies to scan for material in messages, even if they are end-to-end encrypted.

The commission also wants companies to detect grooming via artificial intelligence and use age verification to find minors on their platforms.

“Elon Musk has been very vocal about his commitment to tackling online child sexual abuse,” said Ylva Johansson, the EU commissioner in charge of the proposal. “I fully expect him to follow through on these public commitments.”

Photo: Tayfun Coskun, Anadolu Agency. A guard opens a door at Twitter’s San Francisco offices. The team that tackles exploitative content shrank significantly amid cuts and turmoil under Elon Musk.
