Computer Active (UK)

How will Microsoft’s AI system catch paedophiles?

Project Artemis will help smaller sites protect children online


Do you have nightmares about the prospect of a world dominated by artificial intelligence (AI)? We wouldn’t blame you. Most discussions about AI focus on the dangers. Killer robots will malfunction and turn on their creators; self-driving cars will squash pedestrians; facial recognition will enslave the population. We’re doomed!

These make great headlines for an anxious age, but the reality is less apocalyptic. Most AI is actually quite mundane, built simply to recognise patterns in photos and language. And it’s being used in many positive ways, particularly by medical researchers to diagnose diseases earlier and more accurately than doctors.

Microsoft now thinks AI can be used to catch paedophiles grooming children online. Work began at a ‘hackathon’ event in November 2018, with Microsoft developers joining teams from Facebook, Google and Snap (which makes Snapchat) to analyse thousands of conversations to understand the phrases paedophiles use when attempting to befriend children.

Since then Microsoft has been working on Project Artemis to develop the research into an AI system that can work out the probability that a conversation is a grooming incident.
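Microsoft hasn’t published how Artemis works internally, but the general idea it describes, scoring a conversation for risk and flagging high scores for human review, can be sketched generically. Everything below is invented for illustration: the phrase patterns, weights, threshold and sigmoid offset are placeholders, not anything Microsoft has disclosed.

```python
# Hypothetical illustration only: Artemis's real method is not public.
# This sketch assigns weights to suspicious phrase patterns, sums the
# matches into a score, squashes the score into a probability, and
# flags high-risk conversations for a human moderator.
import math
import re

# Placeholder patterns and weights, invented for this example.
PATTERN_WEIGHTS = {
    r"\bare you alone\b": 2.0,
    r"\bdon'?t tell (your )?(mum|dad|parents)\b": 3.0,
    r"\bhow old are you\b": 1.0,
    r"\bsend (me )?a (photo|pic)\b": 2.5,
}

FLAG_THRESHOLD = 0.8  # probability above which a moderator is alerted

def risk_probability(conversation: str) -> float:
    """Sum the weights of matched patterns, map to (0, 1) with a sigmoid."""
    score = sum(weight for pattern, weight in PATTERN_WEIGHTS.items()
                if re.search(pattern, conversation, re.IGNORECASE))
    return 1 / (1 + math.exp(-(score - 3.0)))  # -3.0 is an arbitrary bias

def needs_review(conversation: str) -> bool:
    """True if the conversation should be queued for a human moderator."""
    return risk_probability(conversation) >= FLAG_THRESHOLD

benign = "Did you finish the maths homework?"
suspect = "How old are you? Are you alone? Don't tell your parents we talk."
print(needs_review(benign), needs_review(suspect))  # False True
```

A real system would use a trained language model rather than a hand-written phrase list, which is exactly why Microsoft keeps the actual signals secret: a fixed keyword list like this one would be trivial for offenders to evade.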

The project was led by Dr Hany Farid, an expert in the field of image analysis. In 2009, he worked with Microsoft to build the AI tool PhotoDNA, which identifies images of child exploitation. It’s now used by more than 150 companies and organisations around the world.

Microsoft hasn’t revealed what phrases it looks for, so that paedophiles don’t try to beat the system. Microsoft claims it’s sophisticated enough to distinguish between grooming attempts and erotic conversations between consenting adults.

After testing Artemis on Skype and Xbox Live, Microsoft is now ready to share it with companies that run chat services. They can use the tool to automatically flag suspicious conversations that need to be checked by human moderators, and passed on to the police if necessary.

This kind of automated detection is vital because many smaller companies can’t afford to pay humans to check everything that appears online. Artemis has been built specially for such firms, which lack the millions that Facebook and Google can spend on large teams of moderators.

Andy Burrows, Head of Child Safety Online Policy at the NSPCC, said there’s “no excuse” for sites not to adopt the tool. “It could not only shield young people from abuse, but also pin down predatory adults,” he added.

Deployment of the system will be managed by Thorn, a US charity set up by the actors Demi Moore and Ashton Kutcher that aims to “eliminate child sexual abuse from the internet”. It says that Artemis is a milestone in catching paedophiles because it helps to create an industry standard for what detection and monitoring of predators should look like.

Thorn’s boss Julie Cordua says sophisticated systems are needed because paedophiles are persistent and devious. She said they “try to isolate the child and will follow them across multiple platforms, so they can have multiple exploitation points”.

Microsoft admits that Artemis, which works only in English at present, isn’t a silver bullet, saying the “horrific” crime of internet grooming needs to be tackled by the whole of society working together. It’s encouraging other tech companies to work on Artemis “with the goal of continuous improvement and refinement”.

But this shouldn’t be misinterpreted as pessimism. There’s justified hope that tools like Artemis will help to fight what remains the most sickening threat on the internet. Such valuable work should help to persuade the public that there’s more to AI than bleak predictions of oppression and violence.

