Der Standard

Online, Images Of Abuse Proliferate

By MICHAEL H. KELLER and GABRIEL J.X. DANCE

The two sisters live in fear of being recognized.

Ten years ago, their father did the unthinkabl­e: He posted explicit photos and videos on the internet of them, just 7 and 11 at the time. Many captured violent assaults in their home in the American Midwest, including him and another man drugging and raping the 7-year-old.

The men are now in prison, but their crimes are finding new audiences. This year alone, photos and videos of the sisters were found in over 130 child sexual abuse investigations involving mobile phones, computers and cloud storage accounts.

The digital trail of abuse — often stored on Google Drive, Dropbox and Microsoft OneDrive — haunts the sisters relentlessly, they say, as does the fear of a predator recognizing them from the images.

“That’s in my head all the time — knowing those pictures are out there,” said E., the older sister, who is being identified by her first initial. “Because of the way the internet works, that’s not something that’s going to go away.”

Horrific experiences like theirs are being recirculated across the internet because search engines, social networks and cloud storage are rife with opportunities for criminals to exploit.

The tech industry has been more diligent in recent years in identifying online child sexual abuse material, with a record 45 million photos and videos flagged last year. But the same industry has consistently failed to take aggressive steps to shut it down, an investigation by The New York Times found. The companies have the technical tools to stop the recirculation of abuse imagery, yet they do not take full advantage of them.

Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage, according to federal authorities, and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft’s consumer products scan for illegal images, but only when someone shares them, not when they are uploaded.

And other companies, including Snapchat and Yahoo, look for photos but not videos. (When asked about its video scanning, a Dropbox spokeswoman in July said it was not a “top priority.” On November 7, the company said it had begun scanning some videos last month.)

The largest social network in the world, Facebook, scans its platforms, accounting for over 90 percent of the imagery flagged last year, but the company is not using all available databases to detect the material. And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.

“Each company is coming up with their own balance of privacy versus safety, and they don’t want to do so in public,” said Alex Stamos, who served as chief of information security at both Facebook and Yahoo. “These decisions actually have a humongous impact on children’s safety.”

The main method for detecting the illegal imagery was created in 2009 by Microsoft and Hany Farid, now a professor at the University of California, Berkeley. The software, known as PhotoDNA, allows computers to recognize photos, even altered ones, and compare them against databases of known illegal images. But the technique is limited because no single authoritative list of known illegal material exists.

Even if there were a single list, however, it would not solve the problems of newly created imagery, or the surge in live-streaming abuse.
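The core idea behind this kind of matching can be illustrated with a small, generic sketch of perceptual hashing. This is not Microsoft’s PhotoDNA algorithm, whose actual signature scheme is more robust and not public; the “average hash” method, the file names and the match threshold below are illustrative assumptions.

```python
# A minimal sketch of perceptual-hash matching (illustrative, not PhotoDNA).
# Each image is reduced to a short fingerprint that tolerates resizing and
# recompression, then compared against fingerprints of known illegal images.
from PIL import Image  # third-party library: Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size-by-size grayscale grid and record, per pixel,
    whether it is brighter than the mean brightness."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of previously identified images.
known_hashes = {average_hash("known_image.png")}

# A newly uploaded file is flagged when its fingerprint is close to a known
# one, so minor edits (cropping, re-saving) do not defeat the match.
upload_hash = average_hash("uploaded_image.png")
if any(hamming_distance(upload_hash, h) <= 5 for h in known_hashes):
    print("Match found: flag for human review")
```

Unlike a cryptographic checksum, which changes completely if a single pixel changes, a perceptual fingerprint stays nearly identical under small alterations, which is why it can catch re-uploads of known imagery; it still cannot recognize newly created material.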

For victims like E. and her sister, F., the trauma of the constantly recirculating photos and videos can have devastating effects. Their mother said both sisters had been hospitalized for suicidal thoughts.

And because online offenders are known to seek out abused children, the sisters do not speak publicly.

“You get your voice taken away,” E. said. “Because of those images, I don’t get to talk as myself. It’s just like, Jane Doe.”

Searching for Abuse

Joshua Gonzalez, a computer technician in Texas, was arrested this year with over 400 images of child sexual abuse on his computer, including some of E. and her sister. Mr. Gonzalez said that he had used Microsoft’s search engine, Bing, to find some of the illegal photos and videos, according to court documents.

The Times created a computer program that scoured Bing and other search engines. The automated script found images — dozens in all — that Microsoft’s own PhotoDNA service flagged as known illicit content. Bing even recommended other search terms when a known child abuse website was entered into the search box. Pedophiles have also deployed the site’s “reverse image search” feature, which retrieves pictures based on a sample photo.

After reviewing The Times’s findings, Microsoft said it uncovered a flaw in its scanning practices and was re-examining its search results. But subsequent runs of the program found even more illegal imagery.

The same computer program, when run by The Times on Google’s search engine, did not return abuse content. But documentation provided by the Canadian Center for Child Protection showed that images of child sexual abuse had also been found on Google and that the company had sometimes resisted removing them.

One image captured the midsections of two children forced into explicit acts with each other. It is part of a known series of photos showing the children being sexually exploited.

The Canadian center asked Google to take down the image last August, but Google said it did not meet its threshold for removal, the documents show. Google eventually relented.

Another image, found in September 2018, depicts a woman touching the genitals of a naked 2-year-old girl. Google declined to take down the photo, stating in an email to the Canadian analysts that while it amounted to pedophilia, “it’s not illegal in the United States.” It later admitted it made a mistake.

When The Times asked Google about the image and others identified by the Canadians, a spokesman acknowledged they should have been removed, and subsequently were.

A week after the images were removed, the Canadian center reported two additional images. Google told the Canadian center that neither image met “the reporting threshold,” but later agreed to remove them.

“It baffles us,” said Lianna McDonald, the center’s executive director.

Criminals Everywhere

The problem is not confined to search engines. Pedophiles often leverage multiple technologies and platforms, meeting on chat apps and sharing images on cloud storage.

“The first thing people need to understand is that any system that allows you to share photos and videos is absolutely infested with child sexual abuse,” Mr. Stamos said. Criminals often discuss in online forums and chat groups how to exploit vulnerabilities in platforms, court cases show.

The digital trail that has followed one young abuse victim is representative of the pattern. The girl, now a teenager living on the West Coast of the United States, does not know that footage of her abuse is on the internet. Her mother and stepfather wish it would stay that way.

“To her, the internet is looking up puppies,” her stepfather said.

Sex offenders frequently share photos and videos of the girl’s abuse on sites that appear on Bing and elsewhere. When the images are detected, the F.B.I. notifies the girl’s family or their lawyer. Over the past four years, her family says, they have received over 350 notifications about cases across the United States.

When the girl turns 18, she will become the legal recipient of reports about the material. At that point, her mother and stepfather hope, she will be better able to handle the news. They also hold out hope that the tech companies will have managed to remove the images from the internet by then.

“I would love to be able to tell her they were online,” her mother said, “but they are not anymore.”

‘Pictures Are Forever’

In one foster family with multiple victims, two of the foster daughters were filmed being raped by their father, while others were abused but not photographed or filmed. The difference, their foster mother said, can be profound over time.

“They’re angry that those pictures are forever,” she said. “I don’t think they’ll ever be totally wiped out.”

The industry’s response to video content has been even more wanting. There is no common standard for identifying illegal video content, and many major platforms — including AOL, Snapchat and Yahoo — do not even scan for it. AOL and Yahoo did not respond to requests for comment. A Snap spokesman said the company was developing a solution.

Tech companies have known for years that videos of children being sexually abused are shared on their platforms, according to former employees at Microsoft, Twitter, Tumblr and other companies. In 2013, fewer than 50,000 videos were reported. Last year, companies referred more than 22 million to the National Center for Missing and Exploited Children, a United States nonprofit.

In 2017, the tech industry approved a process to make it easier for all companies to detect illicit material, according to confidential emails and other documents that were part of a project run by the Technology Coalition, a group focused on child safety issues that includes most major companies.

One document notes the project’s justification: “Video has become as easy to create as images and no standard solution/process has been adopted by industry.”

But the plan has gone nowhere.

An Uncertain Future

The lack of action across the industry has allowed untold videos and images to remain on the internet and also allowed abuse content to thrive via live-streaming platforms.

Facebook, Google and Microsoft have said they are developing technologies that will find new photos and videos on their platforms, but that could take years.

Many companies, including Amazon, have cited customer privacy as a roadblock to fighting the abuse. Some, like Dropbox and Apple, also invoked security concerns when asked about their practices.

Several digital forensic experts and law enforcement officials said the companies were being disingenuous in invoking security. Mr. Stamos, the former Facebook and Yahoo security chief, said the companies just “don’t want to advertise that they are open for business” to criminals.

“If they’re saying, ‘It’s a security problem,’ they’re saying that they don’t do it,” he said.

KHOLOOD EID FOR THE NEW YORK TIMES: A digital trail of sexual abuse images is haunting victims like these sisters.

KHOLOOD EID FOR THE NEW YORK TIMES: “It baffles us,” said Lianna McDonald, a child protective advocate, of Google’s image removal criteria.
