Houston Chronicle Sunday

Digital trail of abuse traumatizes victims

By Michael H. Keller and Gabriel J.X. Dance

The two sisters live in fear of being recognized. One grew out her bangs and took to wearing hoodies. The other dyed her hair black.

Ten years ago, their father posted explicit photos and videos of them on the internet, when they were just 7 and 11. Many captured violent assaults, including him and another man drugging and raping the 7-year-old.

The men are now in prison, but their crimes are finding new audiences. The two sisters are among the first generation of child sexual abuse victims whose anguish has been preserved on the internet.

This year alone, photos and videos of the sisters were found in over 130 child sexual abuse investigations.

The digital trail of abuse haunts the sisters relentlessly, they said, as does the fear of a predator recognizing them.

“That’s in my head all the time — knowing those pictures are out there,” said E., the older sister, who is being identified only by her first initial to protect her privacy. “Because of the way the internet works, that’s not something that’s going to go away.”

The scope of the problem is only starting to be understood because the tech industry has been more diligent in recent years in identifying online child sexual abuse material, with a record 45 million photos and videos flagged last year.

But the same industry has consistently failed to take aggressive steps to shut it down, an investigation by The New York Times found.

The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material. Yet the industry does not take full advantage of the tools.

Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft’s consumer products scan for illegal images, but only when someone shares them, not when they are uploaded.

And other companies, including Snapchat and Yahoo, look for photos but not videos.

Facebook thoroughly scans its platforms, accounting for over 90 percent of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material. And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.

The main method for detecting the illegal imagery was created in 2009 by Microsoft. The software, known as PhotoDNA, can recognize photos, even altered ones, and compare them against databases of known illegal images. Almost none of the photos and videos detected last year would have been caught without systems like PhotoDNA.
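PhotoDNA itself is proprietary, but the matching step it enables — fingerprinting each upload and checking it against a database of known illegal images — can be sketched in a few lines of Python. This is only an illustration: the real system uses a robust perceptual hash that still matches after resizing or minor edits, while the cryptographic hash used here as a stand-in matches only exact copies; the function names and the database contents are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; matches only byte-identical copies."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of previously identified
# illegal images, as maintained by clearinghouses.
known_bad = {fingerprint(b"previously-flagged image bytes")}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known image and should be reported."""
    return fingerprint(image_bytes) in known_bad
```

A platform scanning aggressively would call a check like this on every file at upload time, rather than only when a file is shared — the gap the article describes at Dropbox, Google and Microsoft.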

But this technique is limited because no single authoritative list of known illegal material exists, allowing countless images to slip through the cracks.

Joshua Gonzalez, a computer technician in Texas, was arrested this year with over 400 images of child sexual abuse on his computer, including some of E. and her sister.

Gonzalez told authorities that he had used Microsoft’s search engine, Bing, to find some of the illegal photos and videos.

A report in January commissioned by TechCrunch found explicit images of children on Bing using search terms like “porn kids.” In response, Microsoft said it would ban results using that term and similar ones.

The Times created a computer program that scoured Bing and other search engines. The automated script repeatedly found images — dozens in all — that Microsoft’s own PhotoDNA service flagged as known illicit content. Bing even recommended other search terms when a known child abuse website was entered into the search box.

While the Times did not view the images, they were reported to the National Center for Missing and Exploited Children and the Canadian Center for Child Protection, which work to combat online child sexual abuse.

The problem is not confined to search engines.

Pedophiles often leverage multiple technologies and platforms, meeting on chat apps and sharing images on cloud storage.

Some criminals have avoided detection by sharing their account logins rather than the files themselves.

The industry’s response to video content has been even more wanting. There is no common standard for identifying illegal video content, and many major platforms do not even scan for it.

A heinous case in Pennsylvania warns of a tsunami of new, hard-to-detect abuse content through livestreaming platforms.

More than a dozen men from around the world were logged in to the business conference software Zoom to watch a livestream of a man sexually assaulting a 6-year-old boy.

None of the major tech companies is able to detect the livestreaming through automated imagery analysis.

Men in the Pennsylvania case were caught in 2015 only because Janelle Blackadar, a detective constable with the Toronto police, discovered the broadcast while conducting an undercover investigation. The detective recorded the stream and alerted Homeland Security Investigations.

The 6-year-old boy was rescued the next day, and 14 men from multiple states have since been sentenced to prison.

For victims like E. and her sister, the trauma of the constantly recirculating photos and videos can have devastating effects. Their mother said both sisters had been hospitalized for suicidal thoughts.

And because online offenders are known to seek out abused children, even into adulthood, the sisters do not speak publicly about the crimes against them.

“You get your voice taken away,” E. said. “Because of those images, I don’t get to talk as myself. It’s just like, Jane Doe.”

Kholood Eid / New York Times — The men who sexually abused these sisters are in prison, but their crimes still find new audiences online.
