Business Day

Staff cuts diminish team tackling child sexual abuse

By Olivia Solon and Jillian Deutsch

Elon Musk has dramatically reduced the size of the Twitter team devoted to tackling child sexual exploitation on the platform, cutting the global team of experts in half and leaving behind an overwhelmed skeleton crew, people familiar with the matter say.

The team now has fewer than 10 specialists to review and escalate reports of child sexual exploitation, three people familiar with the matter said, asking not to be identified.

At the beginning of the year, Twitter had a team of about 20, they said.

The cuts come as legislators in the EU and the UK plan broad-reaching online safety rules that will require social media platforms to better protect children or face significant fines.

Twitter did not respond to a request for comment.

The team — a mix of former law enforcement officers and child safety experts based in the US, Ireland and Singapore — was stretched before the cuts, working long hours to respond to user reports and legal requests, the people said. They were responsible for stopping the distribution of child sexual abuse material, instances of online grooming and media that promoted attraction to minors as an identity or sexual orientation.

Last week, Musk tweeted that “removing child exploitation is priority #1” and called on people to “reply in the comments if you see anything that Twitter needs to address”.

Some prominent hashtags associated with child sexual exploitation have been removed since Musk took over, changes that had been in the works before he joined, the people said. Still, combating this type of messaging is not always as simple as removing tweets containing the offending hashtags, since many have another, innocuous purpose, they said. Offenders also constantly change the terms they use to evade detection.

While artificial intelligence-based tools can be useful for identifying images that have already been reviewed and categorised as child sexual exploitation material by law enforcement, human review is particularly important for understanding the nuances of grooming and other exploitative behaviours, identifying previously unknown abusive images and videos, and understanding regional differences in the law, the people said. Humans are also required to respond to requests from law enforcement as part of criminal investigations.

Losing specialists in Europe and Singapore will make policing non-English-speaking markets a particular challenge, the people said.

These specialists worked closely with dedicated product managers and engineers to build tools and automation to stop the spread of the material, as well as with third-party contractors who helped triage posts that users reported. While only a few employees were cut in the first round of layoffs, the team was hollowed out when Musk called on Twitter’s workers to commit to a “hardcore” culture or lose their jobs, the people said. Musk didn’t create an environment where the team wanted to stay, they added.

The defections were part of a broader exodus from Twitter’s trust and safety team after Musk sent the ultimatum earlier in November, people familiar with the matter said previously.

The company also lost a number of the employees who block foreign disinformation campaigns on the platform, and entire swathes of Twitter’s audience have been left without content moderation, the people said. In the Asia-Pacific region, just one contractor, hired to help with spam in the Korean market, remained, one of the people said.

Twitter has also cut a number of contractors who helped moderate content, Axios has reported. Social media platforms including Facebook, TikTok and Twitter use third-party moderators to help sift flagged posts for violations.

Unlike other types of egregious content that violate Twitter’s rules, it’s illegal for the platform to host child sexual abuse material and, depending on the country, there are requirements to take down and report material within specific time limits.

In the UK, the Online Safety Bill gives regulators the power to fine platforms hosting user-generated content, including Twitter and other social media apps, as much as 10% of their revenue if they fail to police their platforms effectively.

The EU is also planning regulation that would require tech companies to take a more aggressive approach to detecting sexual abuse material.

The European Commission’s controversial proposal would give courts the power to require companies to scan for material in messages, even if they are end-to-end encrypted. The commission also wants companies to detect grooming via artificial intelligence and use age verification to find minors on their platforms.

“Elon Musk has been very vocal about his commitment to tackling online child sexual abuse,” said Ylva Johansson, the EU commissioner in charge of the proposal. “I fully expect him to follow through on these public commitments.

“Having experienced experts and teams in place, as well as those familiar with EU legislation, seems to me an obvious baseline from which to scale up this fight,” she said.

Skeleton crew: Elon Musk has dramatically reduced the size of the Twitter team devoted to tackling child sexual exploitation on the platform
