The Asian Age

AI to curb online child sex abuse


Scientists have developed an AI (artificial intelligence) toolkit that can automatically detect new child sexual abuse photos and videos in online networks and help prosecute offenders. There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year. The people who produce child sexual abuse media are often abusers themselves, said researchers, including those from Lancaster University in the UK.

Spotting newly produced media online can give law enforcement agencies the fresh evidence they need to find and prosecute offenders, they said. However, the sheer volume of activity on peer-to-peer networks makes manual detection virtually impossible. The new toolkit automatically identifies new or previously unknown child sexual abuse media using artificial intelligence.

“Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse,” said lead author Claudia Peersman from Lancaster University.

“And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse,” said Peersman. A number of tools are already available to help law enforcement agents monitor peer-to-peer networks for child sexual abuse media; however, they usually rely on identifying known media.

As a result, these tools are unable to assess the thousands of results they retrieve and cannot spot new media that appear. The Identifying and Catching Originators in P2P (iCOP) Networks toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media.

The new approach combines automatic filename and media analysis techniques in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
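To illustrate the idea of combining filename analysis with media analysis in a single filtering module, the sketch below blends a filename-based score with a stubbed media-classifier score and flags items above a threshold. This is not the iCOP implementation: the token vocabulary, weights, and function names are all hypothetical, and a real system would use trained classifiers and learned vocabularies rather than the placeholders shown here.

```python
import os
import re

# Hypothetical flagged-token vocabulary; a deployed system would learn
# its vocabulary from labelled data rather than hard-code it.
FLAGGED_TOKENS = {"keyword1", "keyword2"}


def filename_score(filename: str) -> float:
    """Fraction of filename tokens that appear in the flagged vocabulary."""
    stem, _ = os.path.splitext(filename)
    tokens = [t for t in re.split(r"[\W_]+", stem.lower()) if t]
    if not tokens:
        return 0.0
    return sum(t in FLAGGED_TOKENS for t in tokens) / len(tokens)


def media_score(path: str) -> float:
    """Stand-in for a trained image/video classifier that would return the
    estimated probability the file is new abuse media. Stubbed to 0.0 so
    the example stays self-contained."""
    return 0.0


def is_flagged(path: str, w_name: float = 0.5, w_media: float = 0.5,
               threshold: float = 0.5) -> bool:
    """Flag an item when the weighted combination of scores meets the threshold."""
    combined = (w_name * filename_score(os.path.basename(path))
                + w_media * media_score(path))
    return combined >= threshold
```

The weighting lets either signal compensate for the other: a file whose name reveals nothing can still be flagged by the media classifier, and vice versa, which is one plausible reading of how an "intelligent filtering module" fuses the two analyses.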

The researchers tested iCOP on real-life cases, and law enforcers trialled the toolkit. It was highly accurate, with a false-positive rate of only 7.9 per cent for images and 4.3 per cent for videos, and it complemented the systems and workflows they already use. And since the system can reveal who is sharing known child sexual abuse media, and show other files shared by those people, it will be highly relevant and useful to law enforcers. The research was published in the journal Digital Investigation. — PTI

