Child sex images growing online
Tech CEOs face a Senate hearing
One of the internet’s oldest, ugliest problems keeps getting worse.
Despite decades of efforts to crack down on sexual pictures and videos of children online, they’re more widely available now than ever, according to new data from the nonprofit tasked by the US government with tracking such material. John Shehan, head of the exploited children division at the National Center for Missing and Exploited Children, said reports of child sexual abuse material on online platforms grew from 32 million in 2022 to a record high of more than 36 million in 2023.
“The trends aren’t slowing down,” Shehan said.
On Wednesday, a high-profile hearing will spotlight the issue as the CEOs of tech companies Meta, X, TikTok, Snap, and Discord testify before the Senate Judiciary Committee on their respective efforts to combat child sexual abuse material.
But decrying the problem may prove to be easier than solving it. The diffuse nature of the internet, legal questions around free speech and tech company liability, and the fact that 90 percent of reported abuse material is uploaded by people outside the United States all complicate efforts to rein it in.
Senators are convening the hearing as they look to build support for a suite of bills intended to expand protections for children online, including a measure that would allow victims of child sexual abuse to sue platforms that facilitate exploitation. But the proposals have faced pushback from tech lobbyists and some digital rights groups, who argue they would undermine privacy protections and force platforms to inadvertently take down lawful posts. Other measures focus on giving prosecutors more tools to go after those who spread the material.
Preventing the sexual exploitation of kids is one of the rare issues with the potential to unite Republicans and Democrats. Yet over the years, technology has outpaced attempts at regulation.
From naked pictures of teens circulated without their consent to graphic videos of young children being sexually assaulted, the boom has been fueled by the ever-wider global availability of smartphones, surveillance devices, private messaging tools, and unmoderated online forums.
The dissemination of child sex images “has changed over the years, where it once was produced and exchanged in secretive online rings,” said Carrie Goldberg, a lawyer who specializes in sex crimes. “Now most kids have tools in the palm of their hands — i.e., their own phones — to produce it themselves.”
Increasingly, online predators take advantage of that by posing as a flirty peer on a social network or messaging app to entice teens to send compromising photos or videos of themselves. Then they use those as leverage to demand more graphic videos or money, a form of blackmail known as “sextortion.”
The human costs can be grave: some victims have been abducted, forced into sex slavery, or driven to suicide. Many others, Goldberg said, are emotionally scarred or live in fear of their images or videos being exposed to friends, parents, and the wider world. Sextortion schemes in particular, which often target adolescent boys, have been linked to at least a dozen suicides, the National Center for Missing and Exploited Children said last year.