East Bay Times

Report urges fixes to CyberTipline before AI worsens it

By Barbara Ortutay and Matt O'Brien

A tipline set up 26 years ago to combat online child exploitation has not lived up to its potential and needs technological and other improvements to help law enforcement go after abusers and rescue victims, a new report from the Stanford Internet Observatory has found.

The fixes to what the researchers describe as an “enormously valuable” service must also come urgently as new artificial intelligence technology threatens to worsen its problems.

“Almost certainly in the years to come, the CyberTipline will just be flooded with highly realistic-looking AI content, which is going to make it even harder for law enforcement to identify real children who need to be rescued,” said researcher Shelby Grossman, an author of the report.

The service was established by Congress as the main line of defense for children who are exploited online. By law, tech companies must report any child sexual abuse material they find on their platforms to the system, which is operated by the National Center for Missing and Exploited Children. After it receives the reports, NCMEC attempts to find the people who sent or received the material — as well as the victims, if possible. These reports are then sent to law enforcement.

While the sheer amount of CyberTipline reports is overwhelming law enforcement, researchers say volume is just one of several problems core to the system. For instance, many of the reports sent by tech companies lack important details, such as enough information about an offender's identity, the report said. This makes it hard for law enforcement to know which reports to prioritize.

“There are significant issues with the entire system right now and those cracks are going to become chasms in a world in which AI is generating brand-new CSAM,” said Alex Stamos, using the initials for child sexual abuse materials. Stamos is a Stanford lecturer and cybersecurity expert.

The system is behind technologically and plagued by a constant challenge among government and nonprofit tech platforms: the lack of highly skilled engineers, who can get paid far higher salaries in the tech industry. Sometimes those employees are even poached by the same companies that send in the reports.

Then there are legal constraints. According to the report, court decisions have led the staff at NCMEC to stop vetting some files (for instance, if they are not publicly available) before sending them to law enforcement. Many law enforcement officials believe they need a search warrant to access such images, slowing down the process. At times, multiple warrants or subpoenas are needed to identify the same offender.

It's also easy for the system to get distracted. The report reveals that NCMEC recently hit a milestone of a million reports in a single day due to a meme that was spreading on multiple platforms — which some people thought was funny and others were sharing out of outrage.
